Looker BI from the IDE
From CRM schema discovery to production Looker dashboards, without leaving your IDE. Spec-first provisioning, iterative editing, and programmatic automation replace the traditional multi-month, multi-role BI project.
Timelines are engagement estimates including discovery, pipeline setup, data modeling, and stakeholder review. The CLI itself provisions dashboards in seconds.
The Problem
Starting from scratch
Standing up BI from zero requires data engineers for pipelines, analytics engineers for modeling, and BI developers for dashboards. Three roles working sequentially. Months to first dashboard. Every change goes through the same slow cycle.
Already have Looker
Even orgs with existing Looker instances spend weeks on every new dashboard. Manual tile layout in the GUI, no version control on dashboard definitions, no dry-run preview, changes coupled to a BI admin's calendar.
The Prompt
Paste this into Claude Code with g-gremlin available.
We need production dashboards for the RevOps team:

- Renewals pipeline by stage and month
- Net retention by cohort
- Forecast vs. quota with stage progression
- Churn analysis by segment and tenure

Data lives in Salesforce. We have Looker in-house. Build the full pipeline and provision the dashboards.
Spec-First Dashboard Provisioning
Dashboards are defined as version-controlled YAML specs — tiles, filters, layout, drill-throughs. The CLI applies these specs through Looker's programmatic API with deterministic upsert logic. --dry-run previews mutations before writing. Merge mode adds without touching unmanaged elements. Replace mode requires explicit --force.
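As a minimal sketch of what such a spec might look like (the field names and exact schema here are illustrative assumptions, not the CLI's actual format; only the tile/filter/layout structure comes from the description above):

```yaml
# Hypothetical spec shape -- keys are illustrative, not the CLI's real schema.
dashboard:
  title: Renewals Pipeline
  folder: RevOps
  filters:
    - name: close_month
      field: renewals.close_month
      default: "12 months"
  tiles:
    - title: Pipeline by Stage
      type: looker_column
      explore: renewals
      fields: [renewals.stage, renewals.total_arr]
      position: {row: 0, col: 0, width: 12, height: 6}
```

Because the spec is plain YAML in the repo, dashboard changes go through the same review and version-control workflow as code.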
How It Works
Discover Schema
Connect to Salesforce and map objects, fields, relationships, and picklist values.
Build Data Pipeline
Extract from CRM and load into BigQuery with incremental refresh schedules.
Model Analytics Layer
Write SQL transformations: staging, business logic, and metric definitions.
Define Semantic Layer
Generate LookML models with dimensions, measures, explores, and join logic.
Provision Dashboards
Apply YAML specs through Looker API: tiles, filters, layout, drill-throughs.
Iterate in Minutes
Add tiles, modify filters, refine layout — all from the IDE.
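The semantic-layer step above generates LookML; a minimal hand-written equivalent (view, table, and field names are hypothetical) looks roughly like:

```lookml
view: renewals {
  sql_table_name: analytics.fct_renewals ;;

  dimension: stage {
    type: string
    sql: ${TABLE}.stage ;;
  }

  dimension_group: close {
    type: time
    timeframes: [month, quarter]
    sql: ${TABLE}.close_date ;;
  }

  measure: total_arr {
    type: sum
    sql: ${TABLE}.arr ;;
  }
}
```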
"RevOps needs a renewals dashboard in Looker. Let me verify the Looker connection and discover the Salesforce schema."
"Schema mapped, Looker connected, analytics model ready. Provisioning renewals dashboard from YAML spec."
Renewals dashboard live in Looker. 3 tiles, 3 filters. Next: provision remaining 5 dashboards from specs.
Key Capabilities
Progressive Introspection
Discover models, explores, dimensions, and measures. No context-switching to a separate BI tool.
Inline Queries
Run ad-hoc queries against any explore. Output CSV or JSON for validation before building dashboards.
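A sketch of the validation step, assuming Looker's usual JSON result shape of one object per row keyed by fully qualified `view.field` names (the field names and values below are hypothetical):

```python
import csv
import io
import json

# Hypothetical query result in Looker's JSON row format. Values are illustrative.
raw = json.dumps([
    {"renewals.stage": "Commit", "renewals.total_arr": 420000},
    {"renewals.stage": "Closed Won", "renewals.total_arr": 315000},
])

def json_rows_to_csv(raw_json: str) -> str:
    """Flatten a JSON result set into CSV for quick eyeball validation."""
    rows = json.loads(raw_json)
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=list(rows[0]))
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()

print(json_rows_to_csv(raw))
```

Spot-checking numbers this way, before any tile exists, catches modeling mistakes while they are still cheap to fix.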
Spec-First Provisioning
Define dashboards as version-controlled YAML specs. Deterministic upsert with --dry-run preview.
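The CLI's internals are not shown here, but the described --dry-run behavior, diffing a desired spec against live state and reporting mutations without applying them, can be sketched as follows (keying tiles by title is an assumption for illustration):

```python
# Hypothetical sketch of deterministic upsert planning. This mirrors the
# described dry-run behavior, not the CLI's actual implementation.

def plan_mutations(spec_tiles, live_tiles):
    """Return the create/update plan a dry run would print."""
    live_by_title = {t["title"]: t for t in live_tiles}
    plan = []
    for tile in spec_tiles:
        live = live_by_title.get(tile["title"])
        if live is None:
            plan.append(("create", tile["title"]))
        elif live != tile:
            plan.append(("update", tile["title"]))
    return plan

spec = [{"title": "Pipeline by Stage", "type": "looker_column"}]
live = [{"title": "Pipeline by Stage", "type": "looker_bar"}]
print(plan_mutations(spec, live))  # [('update', 'Pipeline by Stage')]
```

Because the plan is computed from the spec and live state alone, re-running it against an unchanged dashboard yields an empty plan, which is what makes the upsert deterministic.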
Iterative Editing
Add or remove individual tiles and filters on live dashboards. No full redeploy required.
Round-Trip Workflow
Describe an existing dashboard, edit the output, redeploy. No manual format translation.
Safe Mutation Controls
Merge mode preserves unmanaged elements. Replace mode requires explicit --force. Dry-run on everything.
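The merge-versus-replace semantics described above can be sketched like this (the function and its internals are assumptions for illustration, not the CLI's actual code):

```python
# Hypothetical sketch: merge mode upserts managed tiles and leaves unmanaged
# ones alone; replace mode drops everything not in the spec, so it demands an
# explicit force flag, mirroring the documented --force requirement.

def apply_spec(live_tiles, spec_tiles, mode="merge", force=False):
    if mode == "replace" and not force:
        raise ValueError("replace mode requires --force")
    spec_by_title = {t["title"]: t for t in spec_tiles}
    if mode == "replace":
        return list(spec_tiles)
    # Merge: update managed tiles in place, keep unmanaged tiles untouched.
    merged = [spec_by_title.pop(t["title"], t) for t in live_tiles]
    return merged + list(spec_by_title.values())

live = [{"title": "Ad-hoc Tile", "managed": False}]
spec = [{"title": "Pipeline by Stage", "managed": True}]
result = apply_spec(live, spec)
print([t["title"] for t in result])  # ['Ad-hoc Tile', 'Pipeline by Stage']
```

The asymmetry is deliberate: additive merges are safe by construction, while the destructive path cannot be reached accidentally.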
Engagement Timeline
Milestone 1 — One Working Dashboard
- Connect to Salesforce and map the relevant schema
- Stand up the data pipeline (CRM to BigQuery)
- Build the analytics layer for one domain
- Deliver one complete, interactive dashboard in Looker
Milestone 2 — Full Dashboard Suite
- Expand to remaining dashboards
- Add LookML validation and metric regression checks
- Document the change workflow for ongoing iteration
- Stakeholder review and polish rounds
Works With Your Existing Stack
CRM Sources
Salesforce, HubSpot, Dynamics 365, or any CRM with API access. g-gremlin handles schema discovery and data extraction.
Data Warehouse
BigQuery as Looker's data source. Data lands in BigQuery via Google's Data Transfer Service or g-gremlin's sink commands.
Visualization
Looker (full Looker with LookML, not Looker Studio). g-gremlin provisions dashboards via Looker's programmatic API.
By the Numbers
Note: Timelines are engagement estimates. Someone still writes YAML specs, defines the data model, and reviews dashboards. The tooling eliminates the dedicated Looker admin role and collapses the analytics-engineer and dashboard-builder roles into a single workflow.
See It Before You Commit
One working dashboard in your Looker instance. No long-term commitment. No new vendors.