
Conversation

@arsenyinfo
Contributor

@arsenyinfo arsenyinfo commented Dec 10, 2025

Changes

Add a `databricks experimental aitools skills` subcommand to install Agent Skills for coding agents:

  • `skills list` - list available skills from the databricks-agent-skills repo
  • `skills install [skill-name]` - install skills to detected coding agents

Multi-agent support

Automatically detects and installs skills for:

  • Claude Code (~/.claude/skills/)
  • Cursor (~/.cursor/skills/)
  • Codex CLI (~/.codex/skills/)
  • OpenCode (~/.config/opencode/skills/)
  • GitHub Copilot (~/.copilot/skills/)
  • Antigravity (~/.gemini/antigravity/global_skills/)

When multiple agents are detected, skills are installed to a canonical location (~/.databricks/agent-skills/) and symlinked to each agent to avoid duplication.
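
A rough sketch of that install strategy (the function name `installSkillForAgents` and the copy fallback are illustrative, not the exact code in this PR):

```go
// Illustrative sketch of the symlink-with-copy-fallback install strategy.
package skills

import (
	"os"
	"path/filepath"
)

func installSkillForAgents(skillName string, agentSkillDirs []string) error {
	home, err := os.UserHomeDir()
	if err != nil {
		return err
	}
	canonical := filepath.Join(home, ".databricks", "agent-skills", skillName)
	// (the real command would first place the skill files under `canonical`)

	for _, dir := range agentSkillDirs {
		if err := os.MkdirAll(dir, 0o755); err != nil {
			return err
		}
		target := filepath.Join(dir, skillName)
		if err := os.Symlink(canonical, target); err != nil {
			// Symlinks can fail on Windows without Developer Mode or admin
			// rights; fall back to copying the skill directory instead.
			// os.CopyFS requires Go 1.23+.
			if copyErr := os.CopyFS(target, os.DirFS(canonical)); copyErr != nil {
				return copyErr
			}
		}
	}
	return nil
}
```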

Refactored agent registry

Unified agent detection logic in lib/agents/agents.go - shared between skills installation and MCP installation commands.
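
For illustration, the shared registry could look roughly like this; the field names and detection heuristic are assumptions based on the snippets quoted in the review below, not the actual contents of lib/agents/agents.go:

```go
package agents

import (
	"os"
	"path/filepath"
)

// Agent describes one supported coding agent: how to detect it and where its
// skills live. (Illustrative shape only.)
type Agent struct {
	Name        string
	ConfigDir   string // home-relative directory whose presence indicates the agent is installed
	SkillsDir   string // home-relative directory where skills should be installed
	SupportsMCP bool   // whether this agent also supports MCP server installation
}

// DetectInstalled returns the subset of agents whose config directory exists.
func DetectInstalled(home string, all []Agent) []Agent {
	var detected []Agent
	for _, a := range all {
		if _, err := os.Stat(filepath.Join(home, a.ConfigDir)); err == nil {
			detected = append(detected, a)
		}
	}
	return detected
}
```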

Why

Skills are simpler than MCP for providing domain-specific guidance to AI agents. While MCP requires a running server, skills are static documentation that agents load on demand. This gives users a lightweight way to get Databricks-specific guidance across all major coding agents.

Tests

Manual testing:

databricks experimental aitools skills list
databricks experimental aitools skills install
databricks experimental aitools skills install databricks-apps

@eng-dev-ecosystem-bot
Collaborator

eng-dev-ecosystem-bot commented Dec 10, 2025

Commit: e7aa354

Run: 21288994176

Env ❌​FAIL 🟨​KNOWN 🔄​flaky 💚​RECOVERED 🙈​SKIP ✅​pass 🙈​skip Time
🟨​ aws linux 3 5 7 411 700 33:14
🟨​ aws windows 3 5 7 413 698 29:27
❌​ aws-ucws linux 2 5 8 5 596 574 155:44
❌​ aws-ucws windows 2 5 8 5 598 572 145:39
💚​ azure linux 2 9 411 699 32:24
💚​ azure windows 2 9 413 697 29:58
❌​ azure-ucws linux 2 3 3 4 7 589 573 145:54
❌​ azure-ucws windows 2 3 2 4 7 592 571 146:46
💚​ gcp linux 2 9 400 705 33:52
💚​ gcp windows 2 9 402 703 34:54
25 interesting tests: 8 RECOVERED, 5 KNOWN, 5 SKIP, 5 flaky, 2 FAIL
Test Name aws linux aws windows aws-ucws linux aws-ucws windows azure linux azure windows azure-ucws linux azure-ucws windows gcp linux gcp windows
🟨​ TestAccept 🟨​K 🟨​K 🟨​K 🟨​K 💚​R 💚​R 🟨​K 🟨​K 💚​R 💚​R
🙈​ TestAccept/bundle/deployment/bind/alert 🙈​S 🙈​S 🙈​S 🙈​S 🙈​S 🙈​S 🙈​S 🙈​S 🙈​S 🙈​S
🙈​ TestAccept/bundle/generate/alert 🙈​S 🙈​S 🙈​S 🙈​S 🙈​S 🙈​S 🙈​S 🙈​S 🙈​S 🙈​S
🟨​ TestAccept/bundle/invariant/no_drift 🙈​S 🙈​S 🟨​K 🟨​K 🙈​S 🙈​S 🟨​K 🟨​K 🙈​S 🙈​S
🟨​ TestAccept/bundle/invariant/no_drift/DATABRICKS_BUNDLE_ENGINE=direct/INPUT_CONFIG=alert.yml.tmpl 🟨​K 🟨​K 🟨​K 🟨​K
❌​ TestAccept/bundle/invariant/no_drift/DATABRICKS_BUNDLE_ENGINE=direct/INPUT_CONFIG=database_catalog.yml.tmpl ❌​F ❌​F ❌​F ❌​F
❌​ TestAccept/bundle/invariant/no_drift/DATABRICKS_BUNDLE_ENGINE=direct/INPUT_CONFIG=synced_database_table.yml.tmpl ❌​F ❌​F ❌​F ❌​F
🙈​ TestAccept/bundle/resources/alerts/basic 🙈​S 🙈​S 🙈​S 🙈​S 🙈​S 🙈​S 🙈​S 🙈​S 🙈​S 🙈​S
🙈​ TestAccept/bundle/resources/alerts/with_file 🙈​S 🙈​S 🙈​S 🙈​S 🙈​S 🙈​S 🙈​S 🙈​S 🙈​S 🙈​S
🙈​ TestAccept/bundle/resources/permissions 🙈​S 🙈​S 🙈​S 🙈​S 🙈​S 🙈​S 🙈​S 🙈​S 🙈​S 🙈​S
🟨​ TestAccept/bundle/resources/permissions/jobs/destroy_without_mgmtperms/with_permissions 🟨​K 🟨​K 🟨​K 🟨​K 🙈​S 🙈​S 🙈​S 🙈​S 🙈​S 🙈​S
🟨​ TestAccept/bundle/resources/permissions/jobs/destroy_without_mgmtperms/with_permissions/DATABRICKS_BUNDLE_ENGINE=direct 🟨​K 🟨​K 🟨​K 🟨​K
💚​ TestAccept/bundle/resources/permissions/jobs/destroy_without_mgmtperms/with_permissions/DATABRICKS_BUNDLE_ENGINE=terraform 💚​R 💚​R 💚​R 💚​R
💚​ TestAccept/bundle/resources/permissions/jobs/destroy_without_mgmtperms/without_permissions 💚​R 💚​R 💚​R 💚​R 🙈​S 🙈​S 🙈​S 🙈​S 🙈​S 🙈​S
💚​ TestAccept/bundle/resources/permissions/jobs/destroy_without_mgmtperms/without_permissions/DATABRICKS_BUNDLE_ENGINE=direct 💚​R 💚​R 💚​R 💚​R
💚​ TestAccept/bundle/resources/permissions/jobs/destroy_without_mgmtperms/without_permissions/DATABRICKS_BUNDLE_ENGINE=terraform 💚​R 💚​R 💚​R 💚​R
🔄​ TestAccept/bundle/resources/pipelines/update ✅​p ✅​p ✅​p ✅​p ✅​p ✅​p ✅​p 🔄​f ✅​p ✅​p
🔄​ TestAccept/bundle/resources/pipelines/update/DATABRICKS_BUNDLE_ENGINE=direct ✅​p ✅​p ✅​p ✅​p ✅​p ✅​p ✅​p 🔄​f ✅​p ✅​p
🔄​ TestAccept/bundle/resources/quality_monitors/change_output_schema_name 🙈​s 🙈​s ✅​p ✅​p 🙈​s 🙈​s 🔄​f ✅​p 🙈​s 🙈​s
🔄​ TestAccept/bundle/resources/quality_monitors/change_output_schema_name/DATABRICKS_BUNDLE_ENGINE=terraform ✅​p ✅​p 🔄​f ✅​p
🔄​ TestAccept/bundle/resources/quality_monitors/change_table_name 🙈​s 🙈​s ✅​p ✅​p 🙈​s 🙈​s 🔄​f ✅​p 🙈​s 🙈​s
💚​ TestAccept/bundle/resources/synced_database_tables/basic 🙈​S 🙈​S 💚​R 💚​R 🙈​S 🙈​S 💚​R 💚​R 🙈​S 🙈​S
💚​ TestAccept/bundle/resources/synced_database_tables/basic/DATABRICKS_BUNDLE_ENGINE=direct 💚​R 💚​R 💚​R 💚​R
💚​ TestAccept/bundle/resources/synced_database_tables/basic/DATABRICKS_BUNDLE_ENGINE=terraform 💚​R 💚​R 💚​R 💚​R
💚​ TestAccept/ssh/connection 💚​R 💚​R 💚​R 💚​R 💚​R 💚​R 💚​R 💚​R 💚​R 💚​R
Top 50 slowest tests (at least 2 minutes):
duration env testname
9:56 aws-ucws windows TestAccept/bundle/resources/synced_database_tables/basic/DATABRICKS_BUNDLE_ENGINE=terraform
8:00 aws-ucws windows TestAccept/bundle/invariant/no_drift/DATABRICKS_BUNDLE_ENGINE=direct/INPUT_CONFIG=database_instance.yml.tmpl
7:07 azure-ucws linux TestAccept/bundle/resources/registered_models/basic/DATABRICKS_BUNDLE_ENGINE=direct
7:01 azure linux TestAccept/bundle/resources/clusters/deploy/update-after-create/DATABRICKS_BUNDLE_ENGINE=direct
6:38 azure-ucws linux TestAccept/bundle/templates/default-python/combinations/serverless/DATABRICKS_BUNDLE_ENGINE=terraform/DLT=yes/NBOOK=yes/PY=no/READPLAN=
6:30 aws-ucws linux TestAccept/bundle/resources/synced_database_tables/basic/DATABRICKS_BUNDLE_ENGINE=terraform
6:29 aws-ucws windows TestAccept/bundle/resources/synced_database_tables/basic/DATABRICKS_BUNDLE_ENGINE=direct
6:26 azure-ucws linux TestAccept/bundle/templates/default-python/combinations/classic/DATABRICKS_BUNDLE_ENGINE=terraform/DLT=yes/NBOOK=yes/PY=no/READPLAN=
6:13 aws-ucws linux TestAccept/bundle/resources/clusters/deploy/update-after-create/DATABRICKS_BUNDLE_ENGINE=direct
6:12 aws linux TestAccept/bundle/resources/clusters/deploy/update-after-create/DATABRICKS_BUNDLE_ENGINE=direct
6:11 aws-ucws linux TestAccept/bundle/resources/registered_models/basic/DATABRICKS_BUNDLE_ENGINE=terraform
6:08 azure-ucws linux TestAccept/bundle/resources/registered_models/basic/DATABRICKS_BUNDLE_ENGINE=terraform
6:04 aws-ucws windows TestAccept/bundle/resources/clusters/deploy/update-after-create/DATABRICKS_BUNDLE_ENGINE=terraform
5:56 gcp windows TestAccept/bundle/resources/clusters/deploy/update-after-create/DATABRICKS_BUNDLE_ENGINE=terraform
5:47 azure-ucws linux TestAccept/bundle/templates/default-python/combinations/classic/DATABRICKS_BUNDLE_ENGINE=direct/DLT=no/NBOOK=yes/PY=yes/READPLAN=
5:46 aws-ucws linux TestAccept/bundle/resources/synced_database_tables/basic/DATABRICKS_BUNDLE_ENGINE=direct
5:34 gcp windows TestAccept/bundle/resources/clusters/deploy/update-after-create/DATABRICKS_BUNDLE_ENGINE=direct
5:33 aws-ucws linux TestAccept/bundle/resources/clusters/deploy/update-after-create/DATABRICKS_BUNDLE_ENGINE=terraform
5:32 aws-ucws windows TestAccept/bundle/resources/clusters/deploy/update-after-create/DATABRICKS_BUNDLE_ENGINE=direct
5:24 azure-ucws windows TestAccept/bundle/templates/default-python/combinations/classic/DATABRICKS_BUNDLE_ENGINE=direct/DLT=no/NBOOK=yes/PY=yes/READPLAN=
5:19 azure-ucws windows TestAccept/bundle/templates/default-python/combinations/serverless/DATABRICKS_BUNDLE_ENGINE=terraform/DLT=yes/NBOOK=no/PY=yes/READPLAN=
5:13 azure-ucws linux TestAccept/bundle/resources/synced_database_tables/basic/DATABRICKS_BUNDLE_ENGINE=terraform
5:12 azure-ucws linux TestAccept/bundle/resources/clusters/deploy/update-after-create/DATABRICKS_BUNDLE_ENGINE=direct
5:10 azure-ucws windows TestAccept/bundle/resources/registered_models/basic/DATABRICKS_BUNDLE_ENGINE=terraform
5:04 azure-ucws windows TestAccept/bundle/resources/synced_database_tables/basic/DATABRICKS_BUNDLE_ENGINE=terraform
5:03 aws-ucws windows TestAccept/bundle/resources/experiments/basic/DATABRICKS_BUNDLE_ENGINE=terraform
4:57 azure-ucws windows TestAccept/bundle/templates/default-python/combinations/classic/DATABRICKS_BUNDLE_ENGINE=terraform/DLT=no/NBOOK=no/PY=no/READPLAN=
4:56 aws-ucws linux TestAccept/bundle/resources/experiments/basic/DATABRICKS_BUNDLE_ENGINE=direct
4:48 azure-ucws windows TestAccept/bundle/templates/default-python/combinations/serverless/DATABRICKS_BUNDLE_ENGINE=terraform/DLT=no/NBOOK=yes/PY=no/READPLAN=
4:34 azure-ucws linux TestAccept/bundle/resources/secret_scopes/permissions/DATABRICKS_BUNDLE_ENGINE=terraform
4:33 azure-ucws windows TestAccept/bundle/templates/default-python/combinations/serverless/DATABRICKS_BUNDLE_ENGINE=terraform/DLT=no/NBOOK=yes/PY=yes/READPLAN=
4:32 azure-ucws windows TestAccept/bundle/templates/default-python/combinations/serverless/DATABRICKS_BUNDLE_ENGINE=direct/DLT=yes/NBOOK=no/PY=yes/READPLAN=1
4:31 azure-ucws linux TestAccept/bundle/resources/quality_monitors/change_assets_dir/DATABRICKS_BUNDLE_ENGINE=terraform
4:28 azure-ucws linux TestAccept/bundle/resources/synced_database_tables/basic/DATABRICKS_BUNDLE_ENGINE=direct
4:27 azure linux TestAccept/bundle/resources/clusters/deploy/update-after-create/DATABRICKS_BUNDLE_ENGINE=terraform
4:26 aws-ucws windows TestAccept/bundle/templates/default-python/combinations/serverless/DATABRICKS_BUNDLE_ENGINE=direct/DLT=yes/NBOOK=yes/PY=no/READPLAN=1
4:26 azure-ucws windows TestAccept/bundle/resources/clusters/deploy/update-after-create/DATABRICKS_BUNDLE_ENGINE=terraform
4:24 aws-ucws linux TestAccept/bundle/templates/default-python/combinations/classic/DATABRICKS_BUNDLE_ENGINE=direct/DLT=no/NBOOK=no/PY=yes/READPLAN=
4:23 azure-ucws linux TestAccept/bundle/templates/default-python/combinations/serverless/DATABRICKS_BUNDLE_ENGINE=direct/DLT=yes/NBOOK=no/PY=no/READPLAN=1
4:23 aws-ucws linux TestAccept/bundle/templates/default-python/combinations/classic/DATABRICKS_BUNDLE_ENGINE=terraform/DLT=no/NBOOK=no/PY=yes/READPLAN=
4:16 azure-ucws windows TestAccept/bundle/templates/default-python/combinations/classic/DATABRICKS_BUNDLE_ENGINE=direct/DLT=yes/NBOOK=no/PY=no/READPLAN=
4:13 aws-ucws linux TestAccept/bundle/templates/default-python/combinations/serverless/DATABRICKS_BUNDLE_ENGINE=terraform/DLT=no/NBOOK=no/PY=no/READPLAN=
4:10 azure-ucws linux TestAccept/bundle/templates/default-python/combinations/serverless/DATABRICKS_BUNDLE_ENGINE=direct/DLT=yes/NBOOK=yes/PY=yes/READPLAN=1
4:10 aws-ucws windows TestAccept/bundle/resources/experiments/basic/DATABRICKS_BUNDLE_ENGINE=direct
4:10 azure-ucws linux TestAccept/bundle/templates/default-python/combinations/serverless/DATABRICKS_BUNDLE_ENGINE=terraform/DLT=no/NBOOK=yes/PY=yes/READPLAN=
4:09 azure-ucws windows TestAccept/bundle/templates/default-python/combinations/classic/DATABRICKS_BUNDLE_ENGINE=terraform/DLT=no/NBOOK=no/PY=yes/READPLAN=
4:07 aws-ucws windows TestAccept/bundle/templates/default-python/combinations/serverless/DATABRICKS_BUNDLE_ENGINE=direct/DLT=yes/NBOOK=no/PY=yes/READPLAN=1
4:06 azure-ucws windows TestAccept/bundle/templates/default-python/combinations/serverless/DATABRICKS_BUNDLE_ENGINE=terraform/DLT=no/NBOOK=no/PY=yes/READPLAN=
4:06 azure-ucws windows TestAccept/bundle/resources/secret_scopes/permissions/DATABRICKS_BUNDLE_ENGINE=terraform
4:05 azure-ucws windows TestAccept/bundle/templates/default-python/combinations/serverless/DATABRICKS_BUNDLE_ENGINE=terraform/DLT=yes/NBOOK=yes/PY=yes/READPLAN=

@arsenyinfo arsenyinfo changed the title from "claude skill" to "aitools: add skills install command for Claude Code" on Jan 13, 2026
Add `databricks experimental aitools skills` subcommand:
- `skills list` - list available skills
- `skills install` - install all skills to ~/.claude/skills/

Includes databricks-apps skill with reference docs synced from
the appkit template.

Skills now fetched from databricks/databricks-agent-skills repo.
- Support 8 agents: Claude Code, Cursor, Windsurf, Cline, Roo Code, Codex CLI, Amp, OpenCode
- Auto-detect installed agents and print which were found
- Use symlinks when multiple agents detected (canonical location: ~/.databricks/agent-skills/)
- Fallback to copy if symlink fails (Windows without admin)
- Support GitHub token from env vars or gh CLI for private repo access

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>

- Create agents.Agent struct with detection, skills dir, and optional MCP install
- Supported agents: Claude Code, Cursor, Codex CLI, OpenCode, GitHub Copilot, Antigravity
- Update skills.go and install.go to use shared registry
- Remove duplicated detection logic from claude.go and cursor.go

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
@arsenyinfo arsenyinfo changed the title from "aitools: add skills install command for Claude Code" to "aitools: add skills command for coding agents" on Jan 23, 2026
@arsenyinfo arsenyinfo marked this pull request as ready for review January 23, 2026 14:23
@arsenyinfo arsenyinfo requested review from a team and lennartkats-db as code owners January 23, 2026 14:23
Comment on lines +148 to +150
// use GitHub API for private repo support
url := fmt.Sprintf("https://api.github.com/repos/%s/%s/contents/%s/%s/%s?ref=%s",
skillsRepoOwner, skillsRepoName, skillsRepoPath, skillName, filePath, getSkillsBranch())
Contributor

Can we still use the embedded version of the skills by default? We've had a lot of feedback from enterprise customers in the past about the CLI accessing the open internet. Ideally the CLI works in internet-restricted environments out of the box!

Using the embedded skills also helps keep the skills versioned together with the CLI, so they're always 100% compatible with the present CLI release.
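
(For context, the embedded route the reviewer describes could be done with Go's `embed` package. This is only a sketch; the `assets/skills` layout is an assumption, not something this PR does today:)

```go
package skills

import (
	"embed"
	"io/fs"
)

// Sketch: vendor a pinned snapshot of databricks-agent-skills into the binary
// so `skills install` works without network access. The assets/skills path is
// an assumed layout.
//
//go:embed assets/skills
var embeddedSkills embed.FS

// EmbeddedSkillNames lists the skill directories bundled into this CLI build.
func EmbeddedSkillNames() ([]string, error) {
	entries, err := fs.ReadDir(embeddedSkills, "assets/skills")
	if err != nil {
		return nil, err
	}
	var names []string
	for _, e := range entries {
		if e.IsDir() {
			names = append(names, e.Name())
		}
	}
	return names, nil
}
```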

Contributor Author

@lennartkats-db Great catch on the restricted mode. In my understanding, we can bundle a specific tag of the Skills repo - will that work?

Contributor

Yes, there should be some way to do that, but it doesn't have to be in the V1 of this. I'd just want to get there eventually.

Honestly, for the V1 I'd mostly want parity in terms of the capabilities of the skills, i.e. have a single "system prompt" and not just apps. That seems like something that's relatively hard to add again as an afterthought.

Comment on lines +347 to +350
useSymlinks := len(detectedAgents) > 1
var canonicalDir string

if useSymlinks {
Contributor

Are you always relying on symlinks here? We should have some fallback for Windows

Contributor Author

will fix

)

// Agent defines a coding agent that can have skills installed and optionally MCP server.
type Agent struct {
Contributor

Great to have this interface!!

if len(detectedAgents) == 0 {
cmdio.LogString(ctx, color.YellowString("No supported coding agents detected."))
cmdio.LogString(ctx, "")
cmdio.LogString(ctx, "Supported agents: Claude Code, Cursor, Codex CLI, OpenCode, GitHub Copilot, Antigravity")
Contributor

Woo!

}

// getHomeDir returns home directory, handling Windows USERPROFILE.
func getHomeDir() (string, error) {
Contributor

There should be an existing function like this already
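
(For reference, the standard library's os.UserHomeDir already handles this: it returns %USERPROFILE% on Windows and $HOME on Unix, so getHomeDir could likely just wrap it:)

```go
// getHomeDir could simply delegate to the standard library, which already
// resolves USERPROFILE on Windows and HOME on Unix.
func getHomeDir() (string, error) {
	return os.UserHomeDir()
}
```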


const (
skillsRepoOwner = "databricks"
skillsRepoName = "databricks-agent-skills"
Contributor

Are we ready to switch to this yet? We don't have any Lakeflow skills yet and I don't think it has an equivalent to the databricks_explore tool?

Contributor Author

For Apps, migrating MCP => Skill is a top priority atm. As for databricks_explore, we can migrate it to an aitools command so the Lakeflow skills could use it (not sure if Apps will use it or just the databricks skill instead).
