# Directory Structure
```
├── .gitignore
├── .python-version
├── CLAUDE.md
├── docs
│   ├── API_REFERENCES.md
│   ├── CLAUDE.md
│   ├── CONTRIBUTING.md
│   ├── conversations
│   │   ├── account.md
│   │   ├── apply-management-conversation.md
│   │   ├── assessment-results-conversation.md
│   │   ├── cost-estimate-conversation.md
│   │   ├── organization-entitlements-conversation.md
│   │   ├── organizations-management-conversation.md
│   │   ├── plan-management-conversation.md
│   │   ├── project-management-conversation.md
│   │   ├── runs-management-conversation.md
│   │   ├── state_management.md
│   │   ├── variables-conversation.md
│   │   └── workspace-management-conversation.md
│   ├── DEVELOPMENT.md
│   ├── FILTERING_SYSTEM.md
│   ├── models
│   │   ├── account.md
│   │   ├── apply.md
│   │   ├── assessment_result.md
│   │   ├── cost_estimate.md
│   │   ├── organization.md
│   │   ├── plan.md
│   │   ├── project.md
│   │   ├── run.md
│   │   ├── state_version_outputs.md
│   │   ├── state_versions.md
│   │   ├── variables.md
│   │   └── workspace.md
│   ├── README.md
│   └── tools
│       ├── account.md
│       ├── apply.md
│       ├── assessment_results.md
│       ├── cost_estimate.md
│       ├── organization.md
│       ├── plan.md
│       ├── project.md
│       ├── run.md
│       ├── state_version_outputs.md
│       ├── state_versions.md
│       ├── variables.md
│       └── workspace.md
├── env.example
├── LICENSE
├── mypy.ini
├── pyproject.toml
├── README.md
├── terraform_cloud_mcp
│   ├── __init__.py
│   ├── api
│   │   ├── __init__.py
│   │   ├── CLAUDE.md
│   │   └── client.py
│   ├── configs
│   │   ├── __init__.py
│   │   ├── CLAUDE.md
│   │   └── filter_configs.py
│   ├── models
│   │   ├── __init__.py
│   │   ├── account.py
│   │   ├── applies.py
│   │   ├── assessment_results.py
│   │   ├── base.py
│   │   ├── CLAUDE.md
│   │   ├── cost_estimates.py
│   │   ├── filters.py
│   │   ├── organizations.py
│   │   ├── plans.py
│   │   ├── projects.py
│   │   ├── runs.py
│   │   ├── state_version_outputs.py
│   │   ├── state_versions.py
│   │   ├── variables.py
│   │   └── workspaces.py
│   ├── server.py
│   ├── tools
│   │   ├── __init__.py
│   │   ├── account.py
│   │   ├── applies.py
│   │   ├── assessment_results.py
│   │   ├── CLAUDE.md
│   │   ├── cost_estimates.py
│   │   ├── organizations.py
│   │   ├── plans.py
│   │   ├── projects.py
│   │   ├── runs.py
│   │   ├── state_version_outputs.py
│   │   ├── state_versions.py
│   │   ├── variables.py
│   │   └── workspaces.py
│   └── utils
│       ├── __init__.py
│       ├── CLAUDE.md
│       ├── decorators.py
│       ├── env.py
│       ├── filters.py
│       ├── payload.py
│       └── request.py
└── uv.lock
```
# Files
--------------------------------------------------------------------------------
/docs/conversations/project-management-conversation.md:
--------------------------------------------------------------------------------
```markdown
# Project Management Conversation Examples
This document provides example conversations demonstrating how to use the project management tools in the Terraform Cloud MCP.
## Listing Projects in an Organization
```
**User:** Can you list all projects in my organization jptest-org-1?
**Claude:** I'll list the projects in that organization for you.
[Claude uses list_projects tool]
Here are the projects in jptest-org-1:
1. Development Infrastructure (prj-abc123def456)
   - 5 workspaces
   - Created: 2025-01-15
2. Production Systems (prj-ghi789jkl012)
   - 8 workspaces
   - Created: 2024-12-01
3. CI/CD Pipeline (prj-mno345pqr678)
   - 3 workspaces
   - Created: 2025-02-10
Would you like more details on any specific project?
```
## Getting Project Details
```
**User:** I'd like to see details for project prj-abc123def456
**Claude:** I'll retrieve the project details for you.
[Claude uses get_project_details tool]
Here are the details for project "Development Infrastructure" (prj-abc123def456):
Description: Resources for the development environment
Created: 2025-01-15T09:30:22Z
Last updated: 2025-03-20T14:15:43Z
Tags:
- environment: development
- team: platform
- cost-center: 12345
Contains 5 workspaces:
- dev-networking
- dev-compute
- dev-storage
- dev-databases
- dev-monitoring
No auto-destroy schedules are configured for this project.
```
## Creating a New Project
```
**User:** I'd like to create a new project called "QA Environment" in organization jptest-org-1
**Claude:** I'll create that project for you.
[Claude uses create_project tool]
I've created your new project "QA Environment" in jptest-org-1.
Project ID: prj-stu901vwx234
Created: 2025-04-26T15:45:12Z
The project currently has no workspaces associated with it. Would you like to add any workspaces to this project or configure additional settings like tags?
```
## Creating a Project with Detailed Settings
```
**User:** Create a project called "Compliance Systems" in jptest-org-1 with a description, environment tag set to "compliance", and configured to auto-destroy resources after 14 days of inactivity
**Claude:** I'll create that project with your specified settings.
[Claude uses create_project tool]
I've created your new project "Compliance Systems" in jptest-org-1.
Project ID: prj-yz1234abcd56
Created: 2025-04-26T15:48:30Z
Description: Systems for regulatory compliance
Tags:
- environment: compliance
Auto-destroy configured: Resources will be destroyed after 14 days of inactivity.
The project is ready for use. Would you like to add any workspaces to it now?
```
## Updating a Project
```
**User:** I need to update the "Development Infrastructure" project (prj-abc123def456) to change its description and add a new tag
**Claude:** I'll update that project for you.
[Claude uses update_project tool]
I've updated the "Development Infrastructure" project (prj-abc123def456).
The description has been changed to "Development environment resources and infrastructure".
Added tag: priority: medium
The project now has the following tags:
- environment: development
- team: platform
- cost-center: 12345
- priority: medium
All other settings remain unchanged.
```
## Viewing Project Tag Bindings
```
**User:** What tags are applied to the "Production Systems" project (prj-ghi789jkl012)?
**Claude:** I'll check the tags for that project.
[Claude uses list_project_tag_bindings tool]
The "Production Systems" project (prj-ghi789jkl012) has the following tags:
- environment: production
- team: platform
- cost-center: 67890
- criticality: high
- compliance: soc2
- backup-policy: daily
These tags are used for organization, filtering, and governance purposes in your Terraform Cloud organization.
```
## Adding Tags to a Project
```
**User:** I'd like to add two tags to the "CI/CD Pipeline" project (prj-mno345pqr678): category=pipeline and owner=devops-team
**Claude:** I'll add those tags to the project.
[Claude uses add_update_project_tag_bindings tool]
I've added the tags to the "CI/CD Pipeline" project:
Added tags:
- category: pipeline
- owner: devops-team
The project now has a total of 5 tags:
- environment: shared
- team: devops
- category: pipeline
- owner: devops-team
- priority: high
```
## Moving Workspaces to a Project
```
**User:** I need to move the workspaces "test-network" and "test-compute" (IDs: ws-def456ghi789 and ws-jkl012mno345) to the "QA Environment" project (prj-stu901vwx234)
**Claude:** I'll move those workspaces to the QA Environment project.
[Claude uses move_workspaces_to_project tool]
I've successfully moved both workspaces to the "QA Environment" project:
- "test-network" (ws-def456ghi789)
- "test-compute" (ws-jkl012mno345)
These workspaces will now inherit any project-level settings from the QA Environment project, such as tag bindings. The workspaces' individual settings remain unchanged.
The QA Environment project now has 2 workspaces associated with it.
```
## Deleting a Project
```
**User:** I want to delete the project "Compliance Systems" (prj-yz1234abcd56)
**Claude:** I'll delete that project for you.
[Claude uses delete_project tool]
The "Compliance Systems" project (prj-yz1234abcd56) has been successfully deleted.
Note that this only deleted the project itself, which contained no workspaces. Terraform Cloud will not delete a project that still contains workspaces, so no workspaces or infrastructure resources were affected by this action.
```
## Finding Projects by Search Query
```
**User:** Can you find all projects in jptest-org-1 that have to do with development?
**Claude:** I'll search for development-related projects.
[Claude uses list_projects tool]
I found 2 projects related to "development" in jptest-org-1:
1. Development Infrastructure (prj-abc123def456)
   - 5 workspaces
   - Description: Development environment resources and infrastructure
   - Tags: environment:development, team:platform, cost-center:12345, priority:medium
2. API Development (prj-pqr678stu901)
   - 3 workspaces
   - Description: API development and testing
   - Tags: environment:development, team:api, service:rest
Would you like more details about either of these projects?
```
## Handling Project Errors
```
**User:** Delete project prj-nonexistent123456
**Claude:** I'll try to delete that project.
[Claude uses delete_project tool]
I wasn't able to delete the project. The system returned an error indicating that project ID "prj-nonexistent123456" doesn't exist.
Would you like me to list the available projects in your organization so you can confirm the correct ID?
```
```
--------------------------------------------------------------------------------
/terraform_cloud_mcp/models/filters.py:
--------------------------------------------------------------------------------
```python
"""Filter models for Terraform Cloud MCP.
This module defines Pydantic models and enums for API response filtering,
providing type-safe configuration and validation for filter operations.
Reference: https://developer.hashicorp.com/terraform/cloud-docs/api-docs
"""
from typing import Set, Optional
from pydantic import Field, field_validator
from enum import Enum
from .base import BaseModelConfig
class OperationType(str, Enum):
    """Operation types for API filtering.
    Defines the type of API operation being performed, which affects
    how responses are filtered:
    - READ: Single resource retrieval operations
    - LIST: Multiple resource listing operations
    - MANAGE: Create, update, delete operations
    See:
        docs/models/filters.md for reference
    """
    READ = "read"
    LIST = "list"
    MANAGE = "manage"
class ResourceType(str, Enum):
    """Known resource types for filtering.
    Defines the Terraform Cloud resource types that have specific
    filtering configurations. Each type has tailored field removal
    rules optimized for token efficiency.
    See:
        docs/models/filters.md for reference
    """
    WORKSPACE = "workspace"
    RUN = "run"
    ORGANIZATION = "organization"
    PROJECT = "project"
    VARIABLE = "variable"
    PLAN = "plan"
    APPLY = "apply"
    STATE_VERSION = "state-version"
    COST_ESTIMATE = "cost-estimate"
    ASSESSMENT = "assessment"
    ACCOUNT = "account"
    GENERIC = "generic"
class FilterConfig(BaseModelConfig):
    """Configuration for filtering a specific resource type.
    Defines which fields should be removed from API responses for different
    operation types to reduce token usage while preserving essential data.
    Attributes:
        always_remove: Fields to always remove regardless of operation type
        read_remove: Additional fields to remove for read operations
        list_remove: Additional fields to remove for list operations
        essential_relationships: Essential relationships to keep for read operations
    Example:
        config = FilterConfig(
            always_remove={"permissions", "actions"},
            list_remove={"created-at", "updated-at"},
            essential_relationships={"organization", "project"}
        )
    See:
        docs/models/filters.md for reference
    """
    always_remove: Set[str] = Field(
        default_factory=set,
        description="Fields to always remove regardless of operation type",
    )
    read_remove: Set[str] = Field(
        default_factory=set,
        description="Additional fields to remove for read operations",
    )
    list_remove: Set[str] = Field(
        default_factory=set,
        description="Additional fields to remove for list operations",
    )
    essential_relationships: Optional[Set[str]] = Field(
        default=None, description="Essential relationships to keep for read operations"
    )
    @field_validator("always_remove", "read_remove", "list_remove")
    @classmethod
    def validate_field_names(cls, v: Set[str]) -> Set[str]:
        """Validate that field names are non-empty strings.
        Args:
            v: Set of field names to validate
        Returns:
            Validated set of field names
        Raises:
            ValueError: If any field names are empty or not strings
        """
        if not v:
            return v
        invalid_fields = {
            field
            for field in v
            if not field or not isinstance(field, str) or not field.strip()
        }
        if invalid_fields:
            raise ValueError(f"Field names must be non-empty strings: {invalid_fields}")
        return v
    @field_validator("essential_relationships")
    @classmethod
    def validate_essential_relationships(
        cls, v: Optional[Set[str]]
    ) -> Optional[Set[str]]:
        """Validate essential relationships if provided.
        Args:
            v: Set of relationship names to validate
        Returns:
            Validated set of relationship names
        Raises:
            ValueError: If any relationship names are empty or not strings
        """
        if v is None:
            return v
        invalid_rels = {
            rel for rel in v if not rel or not isinstance(rel, str) or not rel.strip()
        }
        if invalid_rels:
            raise ValueError(
                f"Relationship names must be non-empty strings: {invalid_rels}"
            )
        return v
class FilterRequest(BaseModelConfig):
    """Request parameters for filtering operations.
    Provides a structured way to specify filtering parameters for API responses,
    including resource type, operation type, and custom field specifications.
    Attributes:
        resource_type: Type of resource being filtered
        operation_type: Type of operation being performed
        custom_fields: Optional custom fields to remove beyond default configuration
        preserve_fields: Optional fields to preserve even if normally filtered
    Example:
        request = FilterRequest(
            resource_type=ResourceType.WORKSPACE,
            operation_type=OperationType.LIST,
            custom_fields={"debug-info", "verbose-logs"}
        )
    See:
        docs/models/filters.md for reference
    """
    resource_type: ResourceType = Field(description="Type of resource being filtered")
    operation_type: OperationType = Field(
        default=OperationType.READ, description="Type of operation being performed"
    )
    custom_fields: Optional[Set[str]] = Field(
        default=None,
        description="Optional custom fields to remove beyond default configuration",
    )
    preserve_fields: Optional[Set[str]] = Field(
        default=None,
        description="Optional fields to preserve even if normally filtered",
    )
    @field_validator("custom_fields", "preserve_fields")
    @classmethod
    def validate_custom_fields(cls, v: Optional[Set[str]]) -> Optional[Set[str]]:
        """Validate custom field specifications.
        Args:
            v: Set of custom field names to validate
        Returns:
            Validated set of custom field names
        Raises:
            ValueError: If any field names are empty or not strings
        """
        if v is None:
            return v
        invalid_fields = {
            field
            for field in v
            if not field or not isinstance(field, str) or not field.strip()
        }
        if invalid_fields:
            raise ValueError(
                f"Custom field names must be non-empty strings: {invalid_fields}"
            )
        return v
```
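An illustrative sketch of the models above (the import path matches this repository's layout; the field values are taken from the docstring examples and are only for demonstration):
```python
from terraform_cloud_mcp.models.filters import (
    FilterConfig,
    FilterRequest,
    OperationType,
    ResourceType,
)

# A configuration that always strips permission data, trims timestamps from
# list responses, and keeps two relationships for read operations.
config = FilterConfig(
    always_remove={"permissions", "actions"},
    list_remove={"created-at", "updated-at"},
    essential_relationships={"organization", "project"},
)

# A request describing a workspace LIST operation with an extra custom field.
request = FilterRequest(
    resource_type=ResourceType.WORKSPACE,
    operation_type=OperationType.LIST,
    custom_fields={"debug-info"},
)
print(config.always_remove, request.operation_type)

# The validators reject empty or whitespace-only names.
try:
    FilterConfig(always_remove={" "})
except ValueError as exc:
    print(f"rejected: {exc}")
```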
--------------------------------------------------------------------------------
/terraform_cloud_mcp/api/client.py:
--------------------------------------------------------------------------------
```python
"""Terraform Cloud API client"""
import logging
from typing import Optional, Dict, TypeVar, Union, Any
import httpx
from pydantic import BaseModel
from ..utils.env import get_tfc_token, should_return_raw_response, get_tfc_address
from ..utils.filters import (
    get_response_filter,
    should_filter_response,
    detect_resource_type,
    detect_operation_type,
)
DEFAULT_TOKEN = get_tfc_token()
logger = logging.getLogger(__name__)
if DEFAULT_TOKEN:
    logger.info("Default token provided (masked for security)")
# Type variable for generic request models
ReqT = TypeVar("ReqT", bound=BaseModel)
async def api_request(
    path: str,
    method: str = "GET",
    token: Optional[str] = None,
    params: Dict[str, Any] = {},
    data: Union[Dict[str, Any], BaseModel] = {},
    external_url: bool = False,
    accept_text: bool = False,
    raw_response: Optional[bool] = None,
) -> Dict[str, Any]:
    """Make a request to the Terraform Cloud API with proper error handling."""
    token = token or DEFAULT_TOKEN
    if not token:
        return {
            "error": "Token is required. Please set the TFC_TOKEN environment variable."
        }
    # Convert Pydantic models to dict
    request_data = (
        data.model_dump(exclude_unset=True) if isinstance(data, BaseModel) else data
    )
    headers = {
        "Authorization": f"Bearer {token}",
        "Content-Type": "application/vnd.api+json",
    }
    async with httpx.AsyncClient(follow_redirects=False) as client:
        tfc_address = get_tfc_address()
        url = path if external_url else f"{tfc_address}/api/v2/{path}"
        kwargs: Dict[str, Any] = {"headers": headers, "params": params}
        if request_data:
            kwargs["json"] = request_data
        try:
            response = await client.request(method, url, **kwargs)
            # Handle redirects manually
            if response.status_code in (301, 302, 307, 308):
                location = response.headers.get("Location")
                if not location:
                    return {
                        "error": "Redirect received, but no Location header provided."
                    }
                return await handle_redirect(
                    location, headers, client, accept_text, path, method, raw_response
                )
            # For text responses
            if accept_text:
                return {"content": response.text}
            # Handle 204 No Content responses
            if response.status_code == 204:
                return {"status": "success", "status_code": 204}
            # Handle other success responses
            json_data = response.json()
            # Ensure we return a dict as specified in the function signature
            if not isinstance(json_data, dict):
                json_data = {"data": json_data}
            # Apply response filtering if not disabled
            return _apply_response_filtering(json_data, path, method, raw_response)
        except httpx.RequestError as e:
            logger.error(f"Network error while making request to {url}: {e}")
            return {"error": f"Network error: {str(e)}"}
        except ValueError as e:
            if accept_text and "response" in locals():
                return {"content": response.text}
            logger.error(f"Failed to parse JSON response from {url}: {e}")
            return {"error": f"Failed to parse JSON response: {str(e)}"}
        except Exception as e:
            logger.error(f"Unexpected error while making request to {url}: {e}")
            return {"error": f"Unexpected error: {str(e)}"}
async def handle_redirect(
    location: str,
    headers: Dict[str, str],
    client: httpx.AsyncClient,
    accept_text: bool = False,
    original_path: str = "",
    original_method: str = "GET",
    raw_response: Optional[bool] = None,
) -> Dict[str, Any]:
    """Handle redirects manually, ensuring headers are forwarded."""
    try:
        response = await client.get(location, headers=headers)
        if 200 <= response.status_code < 300:
            # For text responses
            if accept_text:
                return {"content": response.text}
            # Parse the response as JSON and ensure it is a dictionary
            json_data = response.json()
            if not isinstance(json_data, dict):
                json_data = {"data": json_data}
            # Apply response filtering if not disabled
            return _apply_response_filtering(
                json_data, original_path, original_method, raw_response
            )
        return {
            "error": f"Redirect request failed: {response.status_code}",
            "redirect_url": location,
        }
    except httpx.RequestError as e:
        return {
            "error": f"Failed to follow redirect due to network error: {str(e)}",
            "redirect_url": location,
        }
    except ValueError as e:
        # Try returning text content if we're expecting text
        if accept_text and "response" in locals():
            return {"content": response.text}
        return {
            "error": f"Failed to parse JSON response: {str(e)}",
            "redirect_url": location,
        }
    except Exception as e:
        return {
            "error": f"Unexpected error while following redirect: {str(e)}",
            "redirect_url": location,
        }
def _apply_response_filtering(
    json_data: Dict[str, Any],
    path: str,
    method: str,
    raw_response: Optional[bool] = None,
) -> Dict[str, Any]:
    """Apply response filtering based on configuration and request context."""
    # Check if raw response is requested
    if raw_response is True or (raw_response is None and should_return_raw_response()):
        return json_data
    # Check if this response should be filtered
    if not should_filter_response(path, method):
        return json_data
    try:
        # Detect resource type and operation type
        resource_type = detect_resource_type(path, json_data)
        operation_type = detect_operation_type(path, method)
        # Get and apply the appropriate filter
        filter_func = get_response_filter(resource_type)
        filtered_data = filter_func(json_data, operation_type)
        logger.info(
            f"Applied {resource_type} filter ({filter_func.__name__}) for {operation_type} operation on {path}"
        )
        return dict(filtered_data)
    except Exception as e:
        # If filtering fails, log error and return raw data
        logger.warning(
            f"Response filtering failed for {path}: {e}. Returning raw response."
        )
        return json_data
```
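A minimal usage sketch for `api_request` (assumes the `TFC_TOKEN` environment variable is set so the default token resolves; the organization in the path is a placeholder):
```python
import asyncio

from terraform_cloud_mcp.api.client import api_request

async def main() -> None:
    # Default behaviour: the response is filtered for token efficiency;
    # pass raw_response=True to bypass _apply_response_filtering.
    result = await api_request("organizations/my-org/workspaces")  # placeholder org
    if "error" in result:
        print(result["error"])
    else:
        print(len(result.get("data", [])), "workspaces returned")

asyncio.run(main())
```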
--------------------------------------------------------------------------------
/docs/tools/project.md:
--------------------------------------------------------------------------------
```markdown
# Project Tools
This module provides tools for managing projects in Terraform Cloud.
## Overview
Projects in Terraform Cloud are containers for workspaces that help organize them into logical groups. These tools allow you to create, read, update, and delete projects, as well as manage tag bindings and move workspaces between projects.
## API Reference
These tools interact with the Terraform Cloud Projects API:
- [Projects API Documentation](https://developer.hashicorp.com/terraform/cloud-docs/api-docs/projects)
- [Projects Guide](https://developer.hashicorp.com/terraform/cloud-docs/projects)
## Tools Reference
### list_projects
**Function:** `list_projects(organization: str, page_number: int = 1, page_size: int = 20, q: Optional[str] = None, filter_names: Optional[str] = None, filter_permissions_update: Optional[bool] = None, filter_permissions_create_workspace: Optional[bool] = None, sort: Optional[str] = None) -> Dict[str, Any]`
**Description:** Retrieves a paginated list of projects in an organization.
**Parameters:**
- `organization` (str): The organization name
- `page_number` (int, optional): Page number to fetch (default: 1)
- `page_size` (int, optional): Number of results per page (default: 20, max: 100)
- `q` (str, optional): Search query to filter projects by name
- `filter_names` (str, optional): Filter projects by name (comma-separated list)
- `filter_permissions_update` (bool, optional): Filter projects that the user can update
- `filter_permissions_create_workspace` (bool, optional): Filter projects that the user can create workspaces in
- `sort` (str, optional): Sort projects by name ('name' or '-name' for descending)
**Returns:** JSON response containing a paginated list of projects with their configuration and metadata.
**Notes:**
- Requires "list projects" permission on the organization
- Use the search parameter for partial name matches
- Permissions filters are useful for determining which projects you can modify
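**Usage sketch** (illustrative only; assumes `list_projects` is exported from `terraform_cloud_mcp.tools.projects` and, like the underlying API client, is an async coroutine; the organization name is a placeholder):
```python
import asyncio

from terraform_cloud_mcp.tools.projects import list_projects  # assumed module path

async def main() -> None:
    # Search for development-related projects the caller can update,
    # sorted by name ascending.
    result = await list_projects(
        organization="my-org",          # placeholder organization
        q="development",
        filter_permissions_update=True,
        sort="name",
    )
    for project in result.get("data", []):
        print(project["id"], project["attributes"]["name"])

asyncio.run(main())
```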
### get_project_details
**Function:** `get_project_details(project_id: str) -> Dict[str, Any]`
**Description:** Retrieves comprehensive information about a specific project.
**Parameters:**
- `project_id` (str): The ID of the project to retrieve details for (format: "prj-xxxxxxxx")
**Returns:** JSON response with project details including:
- Name and description
- Creation and update timestamps
- Auto-destroy activity duration settings
- Tag bindings
- Workspace count
**Notes:**
- Requires "show project" permission
- Essential for retrieving tag bindings and workspace counts
### create_project
**Function:** `create_project(organization: str, name: str, params: Optional[ProjectParams] = None) -> Dict[str, Any]`
**Description:** Creates a new project in an organization.
**Parameters:**
- `organization` (str): The organization name
- `name` (str): The name for the new project
- `params` (ProjectParams, optional): Additional configuration options:
  - `description`: Human-readable description of the project
  - `auto_destroy_activity_duration`: How long each workspace should wait before auto-destroying
  - `tag_bindings`: List of tag key-value pairs to bind to the project
**Returns:** JSON response with the created project details.
**Notes:**
- Requires "create projects" permission on the organization
- Project names must be unique within an organization
- Tags bound to the project are inherited by workspaces within the project
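**Usage sketch** (illustrative only; assumes `ProjectParams` and `TagBinding` are exported from `terraform_cloud_mcp.models.projects`, the tool is an async coroutine, and the organization name is a placeholder):
```python
import asyncio

from terraform_cloud_mcp.models.projects import ProjectParams, TagBinding
from terraform_cloud_mcp.tools.projects import create_project  # assumed module path

async def main() -> None:
    # Optional settings: description, auto-destroy duration, and a tag binding.
    params = ProjectParams(
        description="Systems for regulatory compliance",
        auto_destroy_activity_duration="14d",
        tag_bindings=[TagBinding(key="environment", value="compliance")],
    )
    result = await create_project("my-org", "Compliance Systems", params=params)
    print(result["data"]["id"])

asyncio.run(main())
```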
### update_project
**Function:** `update_project(project_id: str, params: Optional[ProjectParams] = None) -> Dict[str, Any]`
**Description:** Updates an existing project's settings.
**Parameters:**
- `project_id` (str): The ID of the project to update
- `params` (ProjectParams, optional): Settings to update:
  - `name`: New name for the project
  - `description`: Human-readable description
  - `auto_destroy_activity_duration`: How long each workspace should wait before auto-destroying
**Returns:** JSON response with the updated project details.
**Notes:**
- Requires "update project" permission
- Only specified attributes will be updated
- Does not update tag bindings directly (use add_update_project_tag_bindings)
### delete_project
**Function:** `delete_project(project_id: str) -> Dict[str, Any]`
**Description:** Permanently deletes a project.
**Parameters:**
- `project_id` (str): The ID of the project to delete
**Returns:** Empty response with HTTP 204 status code if successful.
**Notes:**
- Requires "delete project" permission
- Will fail if the project contains any workspaces or stacks
- Move or delete workspaces first before deleting a project
### list_project_tag_bindings
**Function:** `list_project_tag_bindings(project_id: str) -> Dict[str, Any]`
**Description:** Lists all tags bound to a specific project.
**Parameters:**
- `project_id` (str): The ID of the project
**Returns:** JSON response with list of tag bindings including key-value pairs.
**Notes:**
- Requires "show project" permission
- Project tag bindings are inherited by all workspaces within the project
- Useful for understanding which tags will be applied to workspaces
### add_update_project_tag_bindings
**Function:** `add_update_project_tag_bindings(project_id: str, tag_bindings: List[TagBinding]) -> Dict[str, Any]`
**Description:** Adds or updates tag bindings on a project.
**Parameters:**
- `project_id` (str): The ID of the project
- `tag_bindings`: List of TagBinding objects with key-value pairs
**Returns:** JSON response with the complete list of updated tag bindings.
**Notes:**
- Requires "update project" permission
- This is an additive operation (doesn't remove existing tags)
- If a key already exists, its value will be updated
- Tags are automatically propagated to all workspaces in the project
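**Usage sketch** (illustrative only; same module-path and async assumptions as above, and the project ID is a placeholder):
```python
import asyncio

from terraform_cloud_mcp.models.projects import TagBinding
from terraform_cloud_mcp.tools.projects import (  # assumed module path
    add_update_project_tag_bindings,
    list_project_tag_bindings,
)

async def main() -> None:
    project_id = "prj-AbCdEfGhIjKlMnOp"  # placeholder project ID
    await add_update_project_tag_bindings(
        project_id,
        [
            TagBinding(key="category", value="pipeline"),
            TagBinding(key="owner", value="devops-team"),
        ],
    )
    # The update is additive, so existing tags remain; list them to confirm.
    print(await list_project_tag_bindings(project_id))

asyncio.run(main())
```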
### move_workspaces_to_project
**Function:** `move_workspaces_to_project(project_id: str, workspace_ids: List[str]) -> Dict[str, Any]`
**Description:** Moves one or more workspaces into a project.
**Parameters:**
- `project_id` (str): The ID of the destination project
- `workspace_ids`: List of workspace IDs to move
**Returns:** Empty response with HTTP 204 status code if successful.
**Notes:**
- Requires permission to move workspaces on both source and destination projects
- Workspaces will inherit tags from the destination project
- Useful for reorganizing workspaces between projects
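**Usage sketch** (illustrative only; same assumptions as above, with placeholder project and workspace IDs):
```python
import asyncio

from terraform_cloud_mcp.tools.projects import move_workspaces_to_project  # assumed path

async def main() -> None:
    result = await move_workspaces_to_project(
        "prj-stu901vwx234",                      # destination project (placeholder)
        ["ws-def456ghi789", "ws-jkl012mno345"],  # workspaces to move (placeholders)
    )
    # A successful move returns an empty 204-style response; failures come back
    # as a dict with an "error" key (see the table below).
    print(result)

asyncio.run(main())
```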
**Common Error Scenarios:**
| Error | Cause | Solution |
|-------|-------|----------|
| 404 | Project not found | Verify the project ID |
| 403 | Insufficient permissions | Ensure you have proper permissions on projects |
| 422 | Project contains workspaces | Move workspaces out before deleting |
| 422 | Duplicate tag keys | Remove duplicates from tag binding list |
```
--------------------------------------------------------------------------------
/docs/models/project.md:
--------------------------------------------------------------------------------
```markdown
# Project Models
This document describes the data models used for project operations in Terraform Cloud.
## Overview
Project models provide structure and validation for interacting with the Terraform Cloud Projects API. These models define project configuration options, tag bindings, and other settings that can be applied to projects and inherited by their workspaces.
## Models Reference
### TagBinding
**Type:** Object
**Description:** Tag binding configuration for a project. Defines a tag key-value pair that can be bound to a project and inherited by its workspaces.
**Fields:**
- `key` (string, required): The key of the tag
- `value` (string, required): The value of the tag
**JSON representation:**
```json
{
  "key": "environment",
  "value": "production"
}
```
**Usage Context:**
Tag bindings can be applied to projects and are inherited by all workspaces within those projects. They provide a way to consistently categorize and organize workspaces that share common characteristics.
### ProjectParams
**Type:** Object
**Description:** Parameters for project operations without routing fields. Used to specify configuration options when creating or updating projects.
**Fields:**
- `name` (string, optional): Name of the project
- `description` (string, optional): Human-readable description of the project
- `auto_destroy_activity_duration` (string, optional): How long each workspace should wait before auto-destroying (e.g., '14d', '24h')
- `tag_bindings` (List[TagBinding], optional): List of tag key-value pairs to bind to the project
**JSON representation:**
```json
{
  "data": {
    "type": "projects",
    "attributes": {
      "name": "Production Infrastructure",
      "description": "Production environment resources",
      "auto-destroy-activity-duration": "14d"
    }
  }
}
```
**Notes:**
- Field names in JSON use kebab-case format (e.g., "auto-destroy-activity-duration")
- Field names in the model use snake_case format (e.g., auto_destroy_activity_duration)
- All fields are optional when updating, but `name` is required when creating
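A construction sketch (assuming `ProjectParams` and `TagBinding` are exported from `terraform_cloud_mcp.models.projects` and use the snake_case fields with kebab-case aliases described above):
```python
from terraform_cloud_mcp.models.projects import ProjectParams, TagBinding

params = ProjectParams(
    name="Production Infrastructure",
    description="Production environment resources",
    auto_destroy_activity_duration="14d",
    tag_bindings=[
        TagBinding(key="environment", value="production"),
        TagBinding(key="team", value="platform"),
    ],
)

# If the kebab-case aliases shown in the JSON representation above are
# configured on the model, by_alias=True emits keys such as
# "auto-destroy-activity-duration".
print(params.model_dump(by_alias=True, exclude_none=True))
```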
### ProjectListRequest
**Type:** Request Validation Model
**Description:** Used to validate parameters for listing projects.
**Fields:**
- `organization` (string, required): The name of the organization to list projects from
- `page_number` (int, optional): Page number to fetch (default: 1)
- `page_size` (int, optional): Number of results per page (default: 20, max: 100)
- `q` (string, optional): Search query to filter projects by name
- `filter_names` (string, optional): Filter projects by name (comma-separated list)
- `filter_permissions_update` (bool, optional): Filter projects that the user can update
- `filter_permissions_create_workspace` (bool, optional): Filter projects that the user can create workspaces in
- `sort` (string, optional): Sort projects by name ('name' or '-name' for descending)
**Used by:**
- `list_projects` tool function to validate listing parameters
### BaseProjectRequest
**Type:** Request Validation Model
**Description:** Base request model containing common fields for project operations.
**Fields:**
- Includes all fields from ProjectParams
### ProjectCreateRequest
**Type:** Request Validation Model
**Description:** Used to validate parameters for creating a project.
**Fields:**
- `organization` (string, required): The organization name
- `name` (string, required): The name for the new project
- `params` (ProjectParams, optional): Additional configuration options
**Used by:**
- `create_project` tool function to validate project creation parameters
### ProjectUpdateRequest
**Type:** Request Validation Model
**Description:** Used to validate parameters for updating a project.
**Fields:**
- `project_id` (string, required): The ID of the project to update
- `params` (ProjectParams, optional): Settings to update
**Used by:**
- `update_project` tool function to validate project update parameters
### ProjectTagBindingRequest
**Type:** Request Validation Model
**Description:** Used to validate tag binding operations for projects.
**Fields:**
- `project_id` (string, required): The ID of the project
**Used by:**
- `list_project_tag_bindings` tool function to validate project ID
### WorkspaceMoveRequest
**Type:** Request Validation Model
**Description:** Used to validate workspace move operations.
**Fields:**
- `project_id` (string, required): The ID of the destination project
- `workspace_ids` (List[string], required): List of workspace IDs to move
**Used by:**
- `move_workspaces_to_project` tool function to validate move parameters
## API Response Structure
### Project Details Response
```json
{
  "data": {
    "id": "prj-AbCdEfGhIjKlMnOp",
    "type": "projects",
    "attributes": {
      "name": "Production Infrastructure",
      "description": "Production environment resources",
      "auto-destroy-activity-duration": "14d",
      "created-at": "2023-05-15T10:30:00Z", 
      "permissions": {
        "can-destroy": true,
        "can-update": true,
        "can-access-settings": true,
        "can-create-workspace": true,
        "can-move-workspaces": true
      }
    },
    "relationships": {
      "organization": {
        "data": {
          "id": "org-AbCdEfGh",
          "type": "organizations"
        }
      }
    }
  }
}
```
### Tag Bindings Response
```json
{
  "data": [
    {
      "id": "tag-AbCdEfGh",
      "type": "tag-bindings",
      "attributes": {
        "key": "environment",
        "value": "production",
        "created-at": "2023-05-15T10:35:00Z"
      }
    },
    {
      "id": "tag-IjKlMnOp",
      "type": "tag-bindings",
      "attributes": {
        "key": "team",
        "value": "platform",
        "created-at": "2023-05-15T10:35:00Z"
      }
    }
  ]
}
```
### List Projects Response
```json
{
  "data": [
    {
      "id": "prj-AbCdEfGh",
      "type": "projects",
      "attributes": {
        "name": "Production",
        "description": "Production infrastructure",
        "created-at": "2023-05-15T10:30:00Z",
        "workspaces-count": 12
      }
    },
    {
      "id": "prj-IjKlMnOp",
      "type": "projects",
      "attributes": {
        "name": "Development",
        "description": "Development infrastructure",
        "created-at": "2023-05-16T14:20:00Z",
        "workspaces-count": 8
      }
    }
  ],
  "meta": {
    "pagination": {
      "current-page": 1,
      "page-size": 20,
      "prev-page": null,
      "next-page": null,
      "total-pages": 1,
      "total-count": 2
    }
  }
}
```
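A small sketch of reading the list response above (plain dictionary access over the JSON:API shape shown; no other assumptions):
```python
from typing import Any, Dict, List, Tuple

def summarize_projects(response: Dict[str, Any]) -> List[Tuple[str, int]]:
    """Return (name, workspace count) pairs from a list-projects response."""
    return [
        (item["attributes"]["name"], item["attributes"].get("workspaces-count", 0))
        for item in response.get("data", [])
    ]
```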
## Related Resources
- [Project Tools](../tools/project.md)
- [Organization Models](organization.md)
- [Workspace Models](workspace.md)
- [Terraform Cloud API - Projects](https://developer.hashicorp.com/terraform/cloud-docs/api-docs/projects)
```
--------------------------------------------------------------------------------
/terraform_cloud_mcp/models/state_versions.py:
--------------------------------------------------------------------------------
```python
"""State version models for Terraform Cloud API
This module contains models for Terraform Cloud state version-related requests.
Reference: https://developer.hashicorp.com/terraform/cloud-docs/api-docs/state-versions
"""
from enum import Enum
from typing import Optional
from pydantic import Field
from .base import APIRequest
class StateVersionStatus(str, Enum):
    """Status options for state versions in Terraform Cloud.
    Defines the various states a state version can be in during its lifecycle:
    - PENDING: State version has been created but state data is not encoded within the request
    - FINALIZED: State version has been successfully uploaded or created with valid state attribute
    - DISCARDED: State version was discarded because it was superseded by a newer version
    - BACKING_DATA_SOFT_DELETED: Enterprise only - backing files are marked for garbage collection
    - BACKING_DATA_PERMANENTLY_DELETED: Enterprise only - backing files have been permanently deleted
    Reference: https://developer.hashicorp.com/terraform/cloud-docs/api-docs/state-versions#state-version-status
    See:
        docs/models/state_versions.md for reference
    """
    PENDING = "pending"
    FINALIZED = "finalized"
    DISCARDED = "discarded"
    BACKING_DATA_SOFT_DELETED = "backing_data_soft_deleted"
    BACKING_DATA_PERMANENTLY_DELETED = "backing_data_permanently_deleted"
class StateVersionListRequest(APIRequest):
    """Request parameters for listing state versions.
    Defines the parameters for the state version listing API including pagination
    and filtering options.
    Reference: https://developer.hashicorp.com/terraform/cloud-docs/api-docs/state-versions#list-state-versions
    See:
        docs/models/state_versions.md for reference
    """
    filter_workspace_name: Optional[str] = Field(
        None,
        description="Filter by workspace name",
    )
    filter_organization_name: Optional[str] = Field(
        None,
        description="Filter by organization name",
    )
    filter_status: Optional[StateVersionStatus] = Field(
        None,
        description="Filter state versions by status",
    )
    page_number: Optional[int] = Field(
        1,
        ge=1,
        description="Page number to fetch",
    )
    page_size: Optional[int] = Field(
        20,
        ge=1,
        le=100,
        description="Number of results per page",
    )
class StateVersionRequest(APIRequest):
    """Request model for retrieving a state version.
    Used to validate the state version ID parameter for API requests.
    Reference: https://developer.hashicorp.com/terraform/cloud-docs/api-docs/state-versions
    See:
        docs/models/state_versions.md for reference
    """
    state_version_id: str = Field(
        ...,
        description="The ID of the state version to retrieve",
        pattern=r"^sv-[a-zA-Z0-9]{16}$",  # Standard state version ID pattern
    )
class CurrentStateVersionRequest(APIRequest):
    """Request model for retrieving a workspace's current state version.
    Used to validate the workspace ID parameter for current state version API requests.
    Reference: https://developer.hashicorp.com/terraform/cloud-docs/api-docs/state-versions#get-current-state-version
    See:
        docs/models/state_versions.md for reference
    """
    workspace_id: str = Field(
        ...,
        description="The ID of the workspace to retrieve the current state version for",
        pattern=r"^ws-[a-zA-Z0-9]{16}$",  # Standard workspace ID pattern
    )
class StateVersionCreateRequest(APIRequest):
    """Request model for creating a state version.
    Validates and structures the request according to the Terraform Cloud API
    requirements for creating state versions.
    Reference: https://developer.hashicorp.com/terraform/cloud-docs/api-docs/state-versions#create-a-state-version
    See:
        docs/models/state_versions.md for reference
    """
    workspace_id: str = Field(
        ...,
        description="The ID of the workspace to create a state version in",
        pattern=r"^ws-[a-zA-Z0-9]{16}$",  # Standard workspace ID pattern
    )
    serial: int = Field(
        ...,
        description="The serial of the state version",
        ge=0,
    )
    md5: str = Field(
        ...,
        description="An MD5 hash of the raw state version",
        pattern=r"^[a-fA-F0-9]{32}$",  # MD5 hash pattern
    )
    state: Optional[str] = Field(
        None,
        description="Base64 encoded raw state file",
    )
    lineage: Optional[str] = Field(
        None,
        description="Lineage of the state version",
    )
    json_state: Optional[str] = Field(
        None,
        alias="json-state",
        description='Base64 encoded json state, as expressed by "terraform show -json"',
    )
    json_state_outputs: Optional[str] = Field(
        None,
        alias="json-state-outputs",
        description='Base64 encoded output values as represented by "terraform show -json"',
    )
    run_id: Optional[str] = Field(
        None,
        description="The ID of the run to associate with the state version",
        pattern=r"^run-[a-zA-Z0-9]{16}$",  # Standard run ID pattern
    )
class StateVersionParams(APIRequest):
    """Parameters for state version operations without routing fields.
    This model provides all optional parameters for creating state versions,
    reusing field definitions from StateVersionCreateRequest. It separates configuration
    parameters from routing information.
    Reference: https://developer.hashicorp.com/terraform/cloud-docs/api-docs/state-versions
    See:
        docs/models/state_versions.md for reference
    """
    serial: Optional[int] = Field(
        None,
        description="The serial of the state version",
        ge=0,
    )
    md5: Optional[str] = Field(
        None,
        description="An MD5 hash of the raw state version",
        pattern=r"^[a-fA-F0-9]{32}$",  # MD5 hash pattern
    )
    state: Optional[str] = Field(
        None,
        description="Base64 encoded raw state file",
    )
    lineage: Optional[str] = Field(
        None,
        description="Lineage of the state version",
    )
    json_state: Optional[str] = Field(
        None,
        alias="json-state",
        description='Base64 encoded json state, as expressed by "terraform show -json"',
    )
    json_state_outputs: Optional[str] = Field(
        None,
        alias="json-state-outputs",
        description='Base64 encoded output values as represented by "terraform show -json"',
    )
    run_id: Optional[str] = Field(
        None,
        description="The ID of the run to associate with the state version",
        pattern=r"^run-[a-zA-Z0-9]{16}$",  # Standard run ID pattern
    )
# Response handling is implemented through raw dictionaries
```
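A construction sketch for the create-request model above (the workspace ID is a placeholder and the state file path is illustrative):
```python
import base64
import hashlib
import json

from terraform_cloud_mcp.models.state_versions import StateVersionCreateRequest

# Read a local state file and derive the fields the model validates.
with open("terraform.tfstate", "rb") as fh:  # illustrative path
    raw_state = fh.read()
parsed = json.loads(raw_state)

request = StateVersionCreateRequest(
    workspace_id="ws-1234567890abcdef",      # placeholder workspace ID
    serial=parsed["serial"],
    md5=hashlib.md5(raw_state).hexdigest(),
    state=base64.b64encode(raw_state).decode("utf-8"),
    lineage=parsed.get("lineage"),
)
print(request.model_dump(exclude_none=True))
```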
--------------------------------------------------------------------------------
/docs/conversations/cost-estimate-conversation.md:
--------------------------------------------------------------------------------
```markdown
# Cost Estimate Management Conversation
This file demonstrates Claude's ability to interact with the Terraform Cloud MCP to retrieve and analyze cost estimate information.
## Getting a Cost Estimate by ID
```
**User:** Can you check the cost estimate for my latest infrastructure changes? The cost estimate ID is ce-BPvFFrYCqRV6qVBK.
**Claude:** I'll retrieve the cost estimate details for you right away.
[Claude uses get_cost_estimate_details tool]
Here's the summary of your cost estimate:
- **Status**: Finished
- **Prior Monthly Cost**: $0.00
- **Proposed Monthly Cost**: $25.49
- **Change in Monthly Cost**: +$25.49 (entirely new cost; the prior cost was $0.00)
- **Resources Analyzed**: 4 total (3 matched, 1 unmatched)
The estimate shows that your infrastructure changes will add approximately $25.49 to your monthly costs, all of it from newly added resources.
```
## Finding a Cost Estimate from a Run
```
**User:** How do I find the cost estimate for a specific run? I have run ID run-pQtfN5XSe8zaJ2UF.
**Claude:** I'll show you how to find the cost estimate for your specific run ID.
[Claude uses get_run_details tool to retrieve run information]
I can see from the run details that there's a cost estimate associated with this run. Let me get the detailed cost information for you.
[Claude uses get_cost_estimate_details tool]
I found the cost estimate for your run. Here are the details:
- **Cost Estimate ID**: ce-8cZKNpyP3R2fBaXD
- **Status**: Finished
- **Prior Monthly Cost**: $127.32
- **Proposed Monthly Cost**: $158.76
- **Change in Monthly Cost**: +$31.44 (24.7% increase)
- **Resources Analyzed**: 18 total (16 matched, 2 unmatched)
Your run will increase costs by $31.44 per month. The two unmatched resources are likely new resources being added that don't have previous cost data.
```
## Understanding Cost Estimate Errors
```
**User:** I tried to look at a cost estimate but got an error. The ID was ce-ZXcY12AbPq9LkMnR.
**Claude:** Let me check what might be causing the error with that cost estimate ID.
[Claude uses get_cost_estimate_details tool]
I checked the cost estimate with ID ce-ZXcY12AbPq9LkMnR and found the following issue:
The cost estimate appears to have an error status. Here are the details:
- **Status**: Errored
- **Error Message**: "Failed to calculate costs: Provider AWS returned error: Resource pricing data unavailable for region eu-west-3"
- **Timestamp**: Started at 2023-06-15T14:23:18Z, errored at 2023-06-15T14:23:45Z
This error is occurring because the cost estimation service doesn't have pricing data for resources in the eu-west-3 AWS region. This typically happens when:
1. You're using a newer AWS region that isn't fully integrated with the Terraform Cloud cost estimation service yet
2. You're using resources that don't have standard pricing data available
Options to resolve this:
1. Consider using an AWS region that has full cost estimation support, like us-east-1, us-west-2, or eu-west-1
2. You can still proceed with your infrastructure changes, but you won't get accurate cost estimates
3. Contact Terraform Cloud support if you need cost estimation support for this specific region
```
## Comparing Multiple Cost Estimates
```
**User:** I want to compare the cost estimates from my last two runs to see which one is more cost-effective.
**Claude:** I'll help you compare the cost estimates from your last two runs.
[Claude uses get_run_details tool for each run to extract cost estimate IDs]
[Claude uses get_cost_estimate_details tool for each cost estimate]
Here's a comparison of the cost estimates from your last two runs:
| Run | Message | Prior Cost | Proposed Cost | Change | Resources |
|-----|---------|------------|---------------|--------|-----------|
| run-pQtfN5XSe8zaJ2UF | Add Redis cluster | $127.32 | $158.76 | +$31.44 | 18 |
| run-aB3cD4eF5gH6iJ7k | Optimize compute resources | $158.76 | $142.18 | -$16.58 | 18 |
Based on this comparison:
1. The first run adds a Redis cluster, increasing your monthly costs by $31.44
2. The second run optimizes your compute resources, decreasing your monthly costs by $16.58
The second run is more cost-effective, saving you $16.58 per month compared to the first run. However, note that the runs have different purposes - one adds functionality while the other optimizes costs. The most cost-effective choice depends on whether you need the Redis cluster functionality.
**User:** That's helpful! How can I get more detailed information about which resources contribute the most to my costs?
**Claude:** To get more detailed information about which resources contribute most to your costs, I'll analyze the detailed breakdown from your cost estimate.
[Claude uses get_cost_estimate_details tool to retrieve detailed resource cost information]
Based on the analysis of your cost estimate, here are the key insights:
### Top Resource Types by Cost
1. **aws_instance**: $72.34/month (45.6% of total costs)
2. **aws_rds_cluster**: $45.60/month (28.7% of total costs) 
3. **aws_elasticache_cluster**: $31.44/month (19.8% of total costs)
4. **aws_s3_bucket**: $5.62/month (3.5% of total costs)
5. **aws_ebs_volume**: $3.76/month (2.4% of total costs)
### Top Individual Resources
1. **aws_rds_cluster.primary**: $45.60/month
   - Main database cluster
   - Components: Instance costs, storage costs, backup costs
2. **aws_elasticache_cluster.redis_primary**: $31.44/month
   - Redis cluster (cache.m5.large)
   - Components: Node hours, memory allocation
3. **aws_instance.application_server**: $14.60/month
   - Application server (t3.large)
   - Components: Compute hours, storage
4. **aws_instance.web_servers[4]**: $14.47/month
   - Web server instance #4 (t3.large)
   - Components: Compute hours, storage
5. **aws_instance.web_servers[3]**: $14.47/month
   - Web server instance #3 (t3.large)
   - Components: Compute hours, storage
### Cost Optimization Recommendations
Based on this cost breakdown, here are some opportunities to reduce costs:
1. **Right-size your instances**: Some of your web servers might be over-provisioned. Consider using t3.medium instead of t3.large for less critical workloads.
2. **Use reserved instances**: Your EC2 instances account for nearly half of your costs. Consider purchasing reserved instances for stable workloads to save 30-60% on your EC2 costs.
3. **Optimize your Redis usage**: Your Redis cluster represents a significant cost. Consider whether you need the current size or if a smaller instance would suffice.
4. **Review your RDS cluster**: Your RDS cluster is your second-highest cost. Review if you need all the provisioned resources or if you could use a smaller instance type.
5. **S3 lifecycle policies**: For your S3 buckets, implement lifecycle policies to transition infrequently accessed data to cheaper storage classes.
Would you like me to dive deeper into any specific resource to find more cost-saving opportunities?
```
```
```
--------------------------------------------------------------------------------
/docs/conversations/plan-management-conversation.md:
--------------------------------------------------------------------------------
```markdown
# Plan Management Conversation Examples
This document provides example conversations demonstrating how to use the plan management tools in the Terraform Cloud MCP.
## Getting Plan Details
```
**User:** I'd like to see details for my plan with ID plan-aBcdEfGh12345678
**Claude:** I'll retrieve the details for that plan for you.
[Claude uses get_plan_details tool]
Here are the details for plan-aBcdEfGh12345678:
Status: finished
Resources to add: 2
Resources to change: 1  
Resources to destroy: 0
The plan was queued at 2025-02-15T14:30:22Z, started at 2025-02-15T14:31:05Z, and finished at 2025-02-15T14:32:18Z.
It's associated with run-pQrStUvW12345678 in workspace "production-infrastructure".
```
## Retrieving Plan Logs
```
**User:** Can you show me the actual Terraform plan output for plan-aBcdEfGh12345678?
**Claude:** I'll retrieve the plan logs for you.
[Claude uses get_plan_logs tool]
I've retrieved the complete plan execution logs. The plan shows that Terraform will create an S3 bucket called "example-data-bucket" with versioning enabled and production tags, create a new IAM role for Lambda execution, and modify an existing security group to add an HTTPS ingress rule allowing traffic from anywhere.
The plan execution used Terraform v1.5.7 and successfully determined that 2 resources will be added and 1 will be changed, with no resources being destroyed.
Would you like me to explain any part of this plan in more detail?
```
## Retrieving Plan JSON Output
```
**User:** Can you get me the JSON execution output for plan-aBcdEfGh12345678? I need to analyze the changes.
**Claude:** I'll retrieve the JSON execution plan for you.
[Claude uses get_plan_json_output tool]
I've obtained the JSON execution plan. Here's an analysis of the planned changes:
The plan includes:
- 2 resources to be created (aws_s3_bucket.data and aws_iam_role.lambda_role)
- 1 resource to be modified (aws_security_group.allow_http)
- 0 resources to be destroyed
Key modifications:
- The security group is having a new ingress rule added for HTTPS (port 443)
- The new S3 bucket will have versioning enabled
- The IAM role has permissions for Lambda and S3 access
The plan was generated with Terraform 1.5.7 and includes all configuration details and state information. Would you like me to extract any specific part of the plan for more detailed analysis?
```
## Retrieving Run Plan JSON Output
```
**User:** I'm working with a run ID run-xYzAbCdE12345678 and need the JSON execution plan from that run.
**Claude:** I'll get the JSON plan output from that run for you.
[Claude uses get_run_plan_json_output tool]
I've retrieved the JSON execution plan for run-xYzAbCdE12345678. This is a destruction plan that will remove the following resources:
- aws_instance.web_server (EC2 instance)
- aws_security_group.web_access
- aws_s3_bucket.logs
- aws_cloudwatch_metric_alarm.high_cpu
The plan was generated with Terraform version 1.6.2. All four resources will be completely removed, and the state will be updated accordingly. 
The plan shows that this will remove:
- 1 EC2 t2.micro instance in us-west-2
- 1 security group with 3 ingress rules
- 1 S3 bucket containing approximately 250MB of log data
- 1 CloudWatch alarm that was monitoring the EC2 instance
Would you like me to extract any specific details from the plan?
```
## Working with Plan Status
```
**User:** My plan seems to be taking a long time. The ID is plan-mNoPqRsT12345678. What's its status?
**Claude:** Let me check the status of that plan for you.
[Claude uses get_plan_details tool]
Your plan with ID plan-mNoPqRsT12345678 is currently in the "running" status. 
It was queued at 2025-03-01T09:15:33Z and started execution at 2025-03-01T09:16:10Z (about 25 minutes ago).
This plan is running in agent execution mode on agent "prod-deployment-agent-05" in the "Production" agent pool. Complex plans with many resources can take time to complete, especially if there are many computed values or external data sources.
Would you like me to check if there are any logs available that might provide more information about the progress?
```
## Analyzing Plan Resources
```
**User:** I need to understand what resources will be created in plan-gHiJkLmN12345678
**Claude:** I'll retrieve and analyze the plan details for you.
[Claude uses get_plan_details tool]
The plan plan-gHiJkLmN12345678 will create 5 new resources:
- 2 aws_security_group resources
- 1 aws_instance (EC2) resource
- 1 aws_s3_bucket resource
- 1 aws_iam_role resource
The plan will also modify 2 existing resources and destroy 1 resource.
This plan was created as part of run-eFgHiJkL12345678 in workspace "network-infrastructure" and has already finished execution. It's ready to be applied.
For more detailed analysis of the exact resource configurations, I can retrieve the JSON execution plan which contains the full definition of all resource changes.
```
## Comparing Multiple Plans
```
**User:** I have two plans I want to compare: plan-qRsTuVwX12345678 and plan-yZaBcDeF12345678
**Claude:** I'll retrieve both plans and compare them for you.
[Claude uses get_plan_details tool]
Here's a comparison of the two plans:
Plan 1 (plan-qRsTuVwX12345678):
- Status: finished
- Resources to add: 4
- Resources to change: 2
- Resources to destroy: 1
- Execution time: 1m 45s (2025-03-10)
- Run: run-aBcDeFgH12345678
Plan 2 (plan-yZaBcDeF12345678):
- Status: finished
- Resources to add: 4
- Resources to change: 3
- Resources to destroy: 0
- Execution time: 2m 12s (2025-03-15)
- Run: run-iJkLmNoP12345678
The second plan (from March 15) modifies one additional resource compared to the first plan (from March 10), but doesn't destroy any resources. Both plans add the same number of new resources.
To see exactly what changed between these plans, I can retrieve and compare the JSON output for both.
```
## Examining Plan Logs for Errors
```
**User:** I'm concerned about potential issues in plan-kLmNoPqR12345678. Can you check the logs for any warnings or errors?
**Claude:** I'll examine the plan logs for warnings or errors.
[Claude uses get_plan_logs tool]
I've analyzed the plan logs and found the following issues:
Warnings:
1. Deprecated attribute: The plan contains a warning about using the deprecated "public_ip" attribute on aws_instance.web. The recommended replacement is "public_ip_address".
2. Provider version constraint: There's a warning that the azurerm provider being used (3.0.1) is older than the recommended minimum version (3.2.0).
Potential errors:
1. Data source dependency: The plan shows that aws_vpc.main will be created before aws_subnet.primary, but there's a reference to the VPC ID in the subnet. This could cause issues if the VPC creation fails.
2. Resource validation warning: An S3 bucket name "my-bucket" might not be globally unique, which could cause the apply to fail.
The plan itself completed successfully but these issues might cause problems during apply. Would you like me to give recommendations on how to address these concerns?
```
```
--------------------------------------------------------------------------------
/terraform_cloud_mcp/models/projects.py:
--------------------------------------------------------------------------------
```python
"""Project models for Terraform Cloud API
This module contains models for Terraform Cloud project-related requests.
Reference: https://developer.hashicorp.com/terraform/cloud-docs/api-docs/projects
"""
from typing import List, Optional
from pydantic import Field
from .base import APIRequest
class TagBinding(APIRequest):
    """Tag binding configuration for a project.
    Defines a tag key-value pair that can be bound to a project
    and inherited by its workspaces.
    Reference: https://developer.hashicorp.com/terraform/cloud-docs/api-docs/projects
    See:
        docs/models/project.md for reference
    """
    # Inherits model_config from APIRequest -> BaseModelConfig
    key: str = Field(..., description="The key of the tag")
    value: str = Field(..., description="The value of the tag")
class ProjectListRequest(APIRequest):
    """Request parameters for listing projects in an organization.
    Defines the parameters for the project listing API including pagination
    and search filtering options.
    Reference: https://developer.hashicorp.com/terraform/cloud-docs/api-docs/projects#list-projects
    See:
        docs/models/project.md for reference
    """
    organization: str = Field(
        ...,
        description="The name of the organization to list projects from",
        min_length=3,
        pattern=r"^[a-z0-9][-a-z0-9_]*[a-z0-9]$",
    )
    page_number: Optional[int] = Field(1, ge=1, description="Page number to fetch")
    page_size: Optional[int] = Field(
        20, ge=1, le=100, description="Number of results per page"
    )
    q: Optional[str] = Field(None, description="Search query for name")
    filter_names: Optional[str] = Field(
        None, description="Filter projects by name (comma-separated)"
    )
    filter_permissions_update: Optional[bool] = Field(
        None, description="Filter projects by update permission"
    )
    filter_permissions_create_workspace: Optional[bool] = Field(
        None, description="Filter projects by create workspace permission"
    )
    sort: Optional[str] = Field(
        None, description="Sort projects by name ('name' or '-name' for descending)"
    )
class BaseProjectRequest(APIRequest):
    """Base class for project create and update requests with common fields.
    This includes common fields used in request payloads for project
    creation and update APIs, providing a foundation for more specific project models.
    Reference: https://developer.hashicorp.com/terraform/cloud-docs/api-docs/projects
    Note:
        This class inherits model_config from APIRequest -> BaseModelConfig and provides
        default values for fields based on Terraform Cloud API defaults.
    See:
        docs/models/project.md for detailed field descriptions and usage examples
    """
    # Fields common to both create and update requests
    name: Optional[str] = Field(
        None,
        description="Name of the project",
    )
    description: Optional[str] = Field(
        None,
        description="Description of the project",
    )
    auto_destroy_activity_duration: Optional[str] = Field(
        None,
        alias="auto-destroy-activity-duration",
        description="How long each workspace should wait before auto-destroying (e.g., '14d', '24h')",
    )
    tag_bindings: Optional[List[TagBinding]] = Field(
        None,
        alias="tag-bindings",
        description="Tags to bind to the project, inherited by workspaces",
    )
class ProjectCreateRequest(BaseProjectRequest):
    """Request model for creating a Terraform Cloud project.
    Validates and structures the request according to the Terraform Cloud API
    requirements for creating projects. Extends BaseProjectRequest with
    required fields for creation.
    Reference: https://developer.hashicorp.com/terraform/cloud-docs/api-docs/projects#create-a-project
    Note:
        This inherits all configuration fields from BaseProjectRequest
        while making organization and name required.
    See:
        docs/models/project.md for reference
    """
    # Organization is needed for routing but not included in the payload
    organization: str = Field(
        ...,
        description="The name of the organization to create the project in",
    )
    # Override name to make it required for creation
    name: str = Field(
        ...,
        description="Name of the project",
    )
class ProjectUpdateRequest(BaseProjectRequest):
    """Request model for updating a Terraform Cloud project.
    Validates and structures the request for updating projects. Extends BaseProjectRequest
    with routing fields while keeping all configuration fields optional.
    Reference: https://developer.hashicorp.com/terraform/cloud-docs/api-docs/projects#update-a-project
    Note:
        This inherits all configuration fields from BaseProjectRequest
        and adds required routing field for the update operation.
    See:
        docs/models/project.md for reference
    """
    # Add project_id which is required for updates but not part of the project attributes payload
    project_id: str = Field(
        ...,
        description="The ID of the project to update",
    )
class ProjectParams(BaseProjectRequest):
    """Parameters for project operations without routing fields.
    This model provides all optional parameters for creating or updating projects,
    reusing field definitions from BaseProjectRequest. It separates configuration
    parameters from routing information like organization and project ID.
    Reference: https://developer.hashicorp.com/terraform/cloud-docs/api-docs/projects
    Note:
        When updating a project, use this model to specify only the attributes
        you want to change. Unspecified attributes retain their current values.
        All fields are inherited from BaseProjectRequest.
    See:
        docs/models/project.md for reference
    """
    # Inherits model_config and all fields from BaseProjectRequest
class ProjectTagBindingRequest(APIRequest):
    """Request model for adding or updating tag bindings on a project.
    This model is used for the PATCH /projects/{project_id}/tag-bindings endpoint,
    which allows adding or updating tag bindings on an existing project.
    Reference: https://developer.hashicorp.com/terraform/cloud-docs/api-docs/projects#add-or-update-tag-bindings-on-a-project
    See:
        docs/models/project.md for reference
    """
    project_id: str = Field(
        ...,
        description="The ID of the project to update tag bindings for",
    )
    tag_bindings: List[TagBinding] = Field(
        ...,
        description="Tags to bind to the project",
    )
class WorkspaceMoveRequest(APIRequest):
    """Request model for moving workspaces into a project.
    This model is used for the POST /projects/{project_id}/relationships/workspaces endpoint,
    which allows moving one or more workspaces into a project.
    Reference: https://developer.hashicorp.com/terraform/cloud-docs/api-docs/projects#move-workspaces-into-a-project
    See:
        docs/models/project.md for reference
    """
    project_id: str = Field(
        ...,
        description="The ID of the destination project",
    )
    workspace_ids: List[str] = Field(
        ...,
        description="The IDs of workspaces to move into the project",
    )
```
--------------------------------------------------------------------------------
/docs/tools/workspace.md:
--------------------------------------------------------------------------------
```markdown
# Workspace Tools
This module provides tools for managing workspaces in Terraform Cloud.
## Overview
Workspaces in Terraform Cloud are isolated environments for managing infrastructure, containing Terraform configurations, state files, variables, and run histories. These tools allow you to create, read, update, delete, lock, and unlock workspaces.
## API Reference
These tools interact with the Terraform Cloud Workspaces API:
- [Workspaces API Documentation](https://developer.hashicorp.com/terraform/cloud-docs/api-docs/workspaces)
- [Workspace Settings](https://developer.hashicorp.com/terraform/cloud-docs/workspaces/settings)
## Tools Reference
### list_workspaces
**Function:** `list_workspaces(organization: str, page_number: int = 1, page_size: int = 20, search: Optional[str] = None) -> Dict[str, Any]`
**Description:** Retrieves a paginated list of all workspaces in an organization.
**Parameters:**
- `organization` (str): The name of the organization to list workspaces from
- `page_number` (int, optional): The page number to return (default: 1)
- `page_size` (int, optional): Number of items per page (default: 20, max: 100)
- `search` (str, optional): Filter workspaces by name
**Returns:** JSON response containing paginated list of workspaces with their configuration and metadata.
**Notes:**
- Requires "read workspaces" permission on the organization
- Use the search parameter to find workspaces by partial name match
- Results are paginated, with metadata indicating total count and links to additional pages
### get_workspace_details
**Function:** `get_workspace_details(workspace_id: str = "", organization: str = "", workspace_name: str = "") -> Dict[str, Any]`
**Description:** Gets detailed information about a specific workspace, identified by ID or by organization and workspace name.
**Parameters:**
- `workspace_id` (str, optional): The ID of the workspace (format: "ws-xxxxxxxx")
- `organization` (str, optional): The organization name (required if workspace_id not provided)
- `workspace_name` (str, optional): The workspace name (required if workspace_id not provided)
**Returns:** JSON response with comprehensive workspace details including configuration and current status.
**Notes:**
- You can identify the workspace either by ID directly, or by organization+name combination
- Requires "read workspaces" permission on the organization or workspace
### create_workspace
**Function:** `create_workspace(organization: str, name: str, params: Optional[WorkspaceParams] = None) -> Dict[str, Any]`
**Description:** Creates a new workspace in the specified organization.
**Parameters:**
- `organization` (str): The organization name
- `name` (str): The name for the new workspace
- `params` (WorkspaceParams, optional): Additional configuration options:
  - `description`: Human-readable description
  - `execution_mode`: How operations are executed (remote, local, agent)
  - `terraform_version`: Version of Terraform to use
  - `working_directory`: Subdirectory for Terraform configuration
  - `vcs_repo`: Version control repository configuration
  - `auto_apply`: Whether to auto-apply successful plans
  - And many other options...
**Returns:** JSON response with the created workspace details.
**Notes:**
- Requires "create workspaces" permission on the organization
- Workspace names must be unique within an organization
- Default execution mode is "remote" unless otherwise specified
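**Example (illustrative):** A sketch of creating a workspace with optional settings, assuming `WorkspaceParams` is importable from `terraform_cloud_mcp.models.workspaces` and accepts the snake_case field names listed above; the organization and workspace names are placeholders.
```python
import asyncio

from terraform_cloud_mcp.models.workspaces import WorkspaceParams
from terraform_cloud_mcp.tools.workspaces import create_workspace


async def main() -> None:
    params = WorkspaceParams(
        description="Networking for the staging environment",
        terraform_version="1.7.5",
        working_directory="environments/staging",
        auto_apply=False,
    )
    result = await create_workspace("my-org", "staging-network", params=params)
    print(result["data"]["id"])


asyncio.run(main())
```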
### update_workspace
**Function:** `update_workspace(organization: str, workspace_name: str, params: Optional[WorkspaceParams] = None) -> Dict[str, Any]`
**Description:** Updates an existing workspace's settings.
**Parameters:**
- `organization` (str): The organization name
- `workspace_name` (str): The current workspace name
- `params` (WorkspaceParams, optional): Settings to update, including:
  - `name`: New name for the workspace (if renaming)
  - `description`: Human-readable description
  - And all other options available in create_workspace...
**Returns:** JSON response with the updated workspace details.
**Notes:**
- Requires "update workspace settings" permission
- Only specified attributes will be updated; unspecified attributes remain unchanged
- To rename a workspace, include the new name in the params
### delete_workspace
**Function:** `delete_workspace(organization: str, workspace_name: str) -> Dict[str, Any]`
**Description:** Permanently deletes a workspace and all related resources.
**Parameters:**
- `organization` (str): The organization name
- `workspace_name` (str): The workspace name to delete
**Returns:** Success message with no content (HTTP 204) if successful.
**Notes:**
- Requires "delete workspaces" permission
- This is a destructive operation that cannot be undone
- Will delete all state versions, run history, and configuration versions
- Use safe_delete_workspace if you want to check for resources first
### safe_delete_workspace
**Function:** `safe_delete_workspace(organization: str, workspace_name: str) -> Dict[str, Any]`
**Description:** Safely deletes a workspace after checking if it has resources.
**Parameters:**
- `organization` (str): The organization name
- `workspace_name` (str): The workspace name to delete
**Returns:** Success message or error if the workspace has resources.
**Notes:**
- Requires "delete workspaces" permission
- Prevents accidental deletion of workspaces with active infrastructure
- Will fail if the workspace has any resources
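**Example (illustrative):** A sketch of the safer deletion path, with a placeholder workspace name; the error-dict shape shown here mirrors how other tools in this project report failures, but the exact message comes from the API.
```python
import asyncio

from terraform_cloud_mcp.tools.workspaces import safe_delete_workspace


async def main() -> None:
    result = await safe_delete_workspace("my-org", "scratch-sandbox")
    if isinstance(result, dict) and "error" in result:
        # Workspace still manages resources: destroy them first, or fall back
        # to delete_workspace only if the state can truly be discarded.
        print("Refusing to delete:", result["error"])
    else:
        print("Workspace deleted")


asyncio.run(main())
```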
### lock_workspace
**Function:** `lock_workspace(workspace_id: str, reason: str = "") -> Dict[str, Any]`
**Description:** Locks a workspace to prevent runs from being queued.
**Parameters:**
- `workspace_id` (str): The ID of the workspace to lock
- `reason` (str, optional): Reason for locking the workspace
**Returns:** JSON response with updated workspace including lock information.
**Notes:**
- Requires "lock workspaces" permission
- Locking prevents new runs but doesn't affect already running plans or applies
- Useful during maintenance or when making manual changes
### unlock_workspace
**Function:** `unlock_workspace(workspace_id: str) -> Dict[str, Any]`
**Description:** Unlocks a previously locked workspace.
**Parameters:**
- `workspace_id` (str): The ID of the workspace to unlock
**Returns:** JSON response with updated workspace showing unlocked status.
**Notes:**
- Requires "unlock workspaces" permission
- Can only unlock workspaces that you locked (unless you have admin rights)
### force_unlock_workspace
**Function:** `force_unlock_workspace(workspace_id: str) -> Dict[str, Any]`
**Description:** Force unlocks a workspace even if locked by another user.
**Parameters:**
- `workspace_id` (str): The ID of the workspace to force unlock
**Returns:** JSON response with updated workspace showing unlocked status.
**Notes:**
- Requires admin privileges on the workspace
- Use with caution - should only be used when the normal unlock process isn't possible
- Typically needed when a run has orphaned a lock or the user who locked the workspace is unavailable
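**Example (illustrative):** A lock-then-unlock sketch around a maintenance window, using a placeholder workspace ID.
```python
import asyncio

from terraform_cloud_mcp.tools.workspaces import lock_workspace, unlock_workspace


async def main() -> None:
    workspace_id = "ws-AbCdEf12345678"  # placeholder ID
    await lock_workspace(workspace_id, reason="Manual state maintenance in progress")
    try:
        ...  # perform the maintenance work here
    finally:
        await unlock_workspace(workspace_id)


asyncio.run(main())
```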
**Common Error Scenarios:**
| Error | Cause | Solution |
|-------|-------|----------|
| 404 | Workspace not found | Verify the workspace ID or name exists |
| 403 | Insufficient permissions | Ensure you have the proper permissions |
| 422 | Validation error | Ensure workspace name follows conventions (only lowercase letters, numbers, hyphens, underscores) |
| 409 | Conflict | Workspace name already exists in organization |
| 423 | Workspace is locked | Unlock the workspace first or use force_unlock_workspace |
```
--------------------------------------------------------------------------------
/docs/conversations/organizations-management-conversation.md:
--------------------------------------------------------------------------------
```markdown
# Organization Management - Example Conversations
This document shows example conversations with Claude using the Terraform Cloud MCP organization management features.
## Listing and Finding Organizations
```
**User**: List all my Terraform Cloud organizations
**Claude**: I'll list all the Terraform Cloud organizations you have access to.
[Claude uses list_organizations tool]
Here are all the Terraform Cloud organizations you have access to:
1. **example-org**
   - Email: [email protected]
   - Created: January 15, 2023
   - Plan: Team & Governance
2. **terraform-demo**
   - Email: [email protected]
   - Created: March 3, 2023
   - Plan: Free
3. **acme-corp**
   - Email: [email protected]
   - Created: June 12, 2023
   - Plan: Business
4. **infrastructure-team**
   - Email: [email protected]
   - Created: August 8, 2023
   - Plan: Free
5. **test-org**
   - Email: [email protected]
   - Created: December 1, 2023
   - Plan: Free
```
```
**User**: Show me details for my "example-org" organization
**Claude**: I'll retrieve detailed information about the "example-org" organization.
[Claude uses get_organization_details tool]
Here are the details for the "example-org" organization:
**Basic Information:**
- Name: example-org
- ID: org-WV6DfwfxxXvLfvfs
- External ID: org-WV6DfwfxxXvLfvfs
- Created: January 15, 2023
- Email: [email protected]
**Plan Details:**
- Plan Type: Team & Governance
- Plan Expired: No
- Trial Status: Not on trial
- Enterprise Plan: No
**Settings:**
- Default Execution Mode: Remote
- Collaborator Auth Policy: Password
- Cost Estimation Enabled: Yes
- Run Queuing Enabled: Yes
- VCS Status Updates: Aggregated (commit status enabled)
- SAML Enabled: No
**Permissions:**
- You can manage users, teams, and workspaces
- You can manage VCS connections
- You can create and manage modules
- You can create and manage policy sets
- You can manage subscription settings
This organization is on the Team & Governance tier with active cost estimation features and standard remote execution setup.
```
```
**User**: Get information about "example-org" organization
**Claude**: I'll retrieve the details for the "example-org" organization.
[Claude uses get_organization_details tool]
Here are the details for the "example-org" organization:
**Basic Information:**
- Name: example-org
- ID: org-WV6DfwfxxXvLfvfs
- External ID: org-WV6DfwfxxXvLfvfs
- Created: January 15, 2023
- Email: [email protected]
- Plan Type: Team & Governance
The organization is on the Team & Governance tier. If you'd like to see the organization's entitlements and feature limits, I can retrieve that information separately.
```
```
**User**: What are the entitlements for my "example-org" organization?
**Claude**: I'll specifically check the entitlements for your "example-org" organization.
[Claude uses get_organization_entitlements tool]
Here are the detailed entitlements for your "example-org" organization:
**Features:**
- Operations: Yes (remote Terraform operations)
- Private Module Registry: Yes
- Sentinel Policies: Yes
- Teams: Yes
- Cost Estimation: Yes
- VCS Integrations: Yes
- State Storage: Yes
- Self-Serve Billing: Yes
- Configuration Designer: Yes
- Run Tasks: Yes
**Enterprise Features (Not Available):**
- SSO: No
- Audit Logging: No
- Agents: No
- Private Policy Agents: No
**Limits:**
- User Limit: 25
- Team Limit: Unlimited
- Policy Limit: 15
- Policy Set Limit: 20
- Run Task Limit: 10
- Run Task Workspace Limit: 100
- Policy Mandatory Enforcement Limit: 5
Your organization has the Team & Governance tier features, including policy as code (Sentinel), cost estimation, and team management. You don't currently have access to enterprise features like SSO or audit logging. Your organization can have up to 25 users and has reasonable limits for policies and run tasks.
```
```
**User**: Find organizations with "prod" in their name
**Claude**: I'll search for organizations with "prod" in their name.
[Claude uses list_organizations tool with name filter]
I found the following organizations with "prod" in their name:
1. **production-infra**
   - Email: [email protected]
   - Created: February 10, 2023
   - Plan: Business
2. **acme-prod**
   - Email: [email protected]
   - Created: July 7, 2023
   - Plan: Team & Governance
3. **production-systems**
   - Email: [email protected]
   - Created: October 15, 2023
   - Plan: Free
These three organizations match your search query for "prod" in their names. Each has different plan levels and was created at different times.
```
## Creating and Managing Organizations
```
**User**: Create a new organization named "terraform-demo" with my email "[email protected]"
**Claude**: I'll create a new organization named "terraform-demo" with your email.
[Claude uses create_organization tool]
I've successfully created a new organization in Terraform Cloud. Here are the details:
**Organization Created:**
- Name: terraform-demo
- Email: [email protected]
- ID: org-Bzyc2JuegvVLAibn
- Creation Date: Just now
**Default Configuration:**
- Execution Mode: Remote
- Authentication Policy: Password
- Cost Estimation: Disabled
- Session Timeout: 20160 minutes (14 days)
**Initial Entitlements:**
- Private Module Registry: Yes
- State Storage: Yes
- VCS Integrations: Yes
- Teams: No (Free tier)
- Sentinel Policies: No (Free tier)
- User Limit: 5 (Free tier)
The organization has been created with default settings. You're currently the owner of this organization. You can invite additional members and create workspaces within this organization.
```
```
**User**: Update my "terraform-demo" organization to enable cost estimation
**Claude**: I'll update the "terraform-demo" organization to enable cost estimation.
[Claude uses update_organization tool]
I've updated the "terraform-demo" organization to enable cost estimation. Here's the confirmation:
**Organization Updated:**
- Name: terraform-demo
- Email: [email protected]
- ID: org-Bzyc2JuegvVLAibn
**Updated Setting:**
- Cost Estimation: Enabled (previously Disabled)
**Note:**
Cost estimation is now enabled for this organization. This feature will:
- Show estimated costs for all Terraform plans
- Calculate cost impact of planned changes
- Display cost breakdowns by resource type
- Show cost trend analysis
The cost estimation feature uses cloud provider pricing data to estimate the cost of resources in your Terraform configurations. This can help teams understand the financial impact of their infrastructure changes before applying them.
```
```
**User**: Change the email address for "terraform-demo" to "[email protected]"
**Claude**: I'll update the email address for the "terraform-demo" organization.
[Claude uses update_organization tool]
I've successfully updated the email address for the "terraform-demo" organization. Here's the confirmation:
**Organization Updated:**
- Name: terraform-demo
- New Email: [email protected] (previously [email protected])
- ID: org-Bzyc2JuegvVLAibn
The organization's notification email has been changed to [email protected]. All system notifications and alerts will now be sent to this address.
```
```
**User**: Delete the "test-org" organization
**Claude**: I'll help you delete the "test-org" organization. This is a significant action that will permanently delete all resources.
[Claude uses delete_organization tool]
**Organization Deletion Successful**
The "test-org" organization has been permanently deleted. All associated resources have been removed:
- All workspaces and their configurations
- All state files and state versions
- All team memberships and user associations
- All variables, policy sets, and modules
The organization name "test-org" is now available for use again if you wish to recreate it in the future.
**Note**: This operation is irreversible. All data associated with the organization has been permanently deleted.
```
```
--------------------------------------------------------------------------------
/docs/tools/run.md:
--------------------------------------------------------------------------------
```markdown
# Run Tools
This module provides tools for managing runs in Terraform Cloud.
## Overview
Runs in Terraform Cloud represent the process of executing Terraform operations (plan, apply) on a workspace. These tools allow you to create, list, and manage runs, including applying, discarding, and canceling them as needed.
## API Reference
These tools interact with the Terraform Cloud Runs API:
- [Runs API Documentation](https://developer.hashicorp.com/terraform/cloud-docs/api-docs/run)
- [Run Workflow](https://developer.hashicorp.com/terraform/cloud-docs/run/states)
## Tools Reference
### create_run
**Function:** `create_run(workspace_id: str, params: Optional[RunParams] = None) -> Dict[str, Any]`
**Description:** Creates and queues a new Terraform run in a workspace.
**Parameters:**
- `workspace_id` (str): The workspace ID to execute the run in (format: "ws-xxxxxxxx")
- `params` (RunParams, optional): Run configuration options:
  - `message`: Description of the run's purpose
  - `is_destroy`: Whether to destroy all resources (default: False)
  - `auto_apply`: Whether to auto-apply after a successful plan
  - `refresh`: Whether to refresh state before planning (default: True)
  - `refresh_only`: Only refresh the state without planning changes
  - `plan_only`: Create a speculative plan without applying
  - `target_addrs`: List of resource addresses to target
  - `replace_addrs`: List of resource addresses to force replacement
  - `variables`: Run-specific variable overrides
**Returns:** JSON response with run details including:
- Run ID and creation timestamp
- Status and status timestamps
- Relationships to workspace, configuration version, and plan
**Notes:**
- Requires "queue run" permission on the workspace
- Workspace must be unlocked to create a run
- Only one run can be active in a workspace at a time
- Run execution depends on the workspace's execution mode
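**Example (illustrative):** A sketch of queuing a targeted run, assuming `RunParams` is importable from `terraform_cloud_mcp.models.runs` and uses the option names listed above; the workspace ID and resource address are placeholders.
```python
import asyncio

from terraform_cloud_mcp.models.runs import RunParams
from terraform_cloud_mcp.tools.runs import create_run


async def main() -> None:
    params = RunParams(
        message="Rotate the web tier instance",
        target_addrs=["aws_instance.web"],
        auto_apply=False,
    )
    run = await create_run("ws-AbCdEf12345678", params=params)
    print(run["data"]["id"], run["data"]["attributes"]["status"])


asyncio.run(main())
```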
### list_runs_in_workspace
**Function:** `list_runs_in_workspace(workspace_id: str, page_number: int = 1, page_size: int = 20, filter_operation: Optional[str] = None, filter_status: Optional[str] = None, filter_source: Optional[str] = None, filter_status_group: Optional[str] = None, filter_timeframe: Optional[str] = None, filter_agent_pool_names: Optional[str] = None, search_user: Optional[str] = None, search_commit: Optional[str] = None, search_basic: Optional[str] = None) -> Dict[str, Any]`
**Description:** Lists and filters runs in a specific workspace.
**Parameters:**
- `workspace_id` (str): The workspace ID (format: "ws-xxxxxxxx")
- `page_number` (int, optional): Page number to fetch (default: 1)
- `page_size` (int, optional): Number of results per page (default: 20)
- `filter_operation`: Filter by operation type (e.g., "plan,apply")
- `filter_status`: Filter by status (e.g., "pending,planning,applying")
- `filter_source`: Filter by source (e.g., "tfe-ui,tfe-api")
- `filter_status_group`: Filter by status group (e.g., "running,pending")
- And many other filtering options...
**Returns:** JSON response with paginated list of runs and metadata.
**Notes:**
- Requires "read runs" permission on the workspace
- Use multiple comma-separated values for filter parameters
- Results are sorted with most recent runs first
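**Example (illustrative):** A filtering sketch that narrows the run list to failed applies, using the comma-separated filter convention noted above; the workspace ID is a placeholder.
```python
import asyncio

from terraform_cloud_mcp.tools.runs import list_runs_in_workspace


async def main() -> None:
    runs = await list_runs_in_workspace(
        "ws-AbCdEf12345678",
        page_size=50,
        filter_operation="apply",
        filter_status="errored",
    )
    for run in runs.get("data", []):
        attrs = run["attributes"]
        print(run["id"], attrs["status"], attrs.get("message"))


asyncio.run(main())
```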
### list_runs_in_organization
**Function:** `list_runs_in_organization(organization: str, page_number: int = 1, page_size: int = 20, filter_operation: Optional[str] = None, filter_status: Optional[str] = None, filter_source: Optional[str] = None, filter_status_group: Optional[str] = None, filter_timeframe: Optional[str] = None, filter_agent_pool_names: Optional[str] = None, filter_workspace_names: Optional[str] = None, search_user: Optional[str] = None, search_commit: Optional[str] = None, search_basic: Optional[str] = None) -> Dict[str, Any]`
**Description:** Lists runs across all workspaces in an organization.
**Parameters:**
- `organization` (str): The organization name
- Same filtering parameters as list_runs_in_workspace plus:
- `filter_workspace_names` (str, optional): Filter by workspace names
**Returns:** JSON response with paginated list of runs across the organization.
**Notes:**
- Requires appropriate permissions on workspaces
- Useful for organization-wide auditing and monitoring
- Returns only runs from workspaces the user has access to
### get_run_details
**Function:** `get_run_details(run_id: str) -> Dict[str, Any]`
**Description:** Gets detailed information about a specific run.
**Parameters:**
- `run_id` (str): The ID of the run (format: "run-xxxxxxxx")
**Returns:** JSON response with comprehensive run details including:
- Run status and phase information
- Timestamps for each state transition
- Configuration information
- Relationships to plans, applies, and cost estimates
**Notes:**
- Requires "read runs" permission on the associated workspace
- Provides access to related resources via relationships
### apply_run
**Function:** `apply_run(run_id: str, comment: str = "") -> Dict[str, Any]`
**Description:** Confirms and applies a run that is paused waiting for confirmation.
**Parameters:**
- `run_id` (str): The ID of the run to apply
- `comment` (str, optional): Comment explaining the approval reason
**Returns:** JSON response with updated run details.
**Notes:**
- Requires "apply" permission on the workspace
- Run must be in "planned" status with changes to apply
- Comment is recorded in the audit log
### discard_run
**Function:** `discard_run(run_id: str, comment: str = "") -> Dict[str, Any]`
**Description:** Discards a run that is paused waiting for confirmation.
**Parameters:**
- `run_id` (str): The ID of the run to discard
- `comment` (str, optional): Comment explaining the discard reason
**Returns:** JSON response with updated run details showing discarded state.
**Notes:**
- Requires "apply" permission on the workspace
- Run must be in "planned" status to be discarded
- Discarded runs cannot be applied later
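**Example (illustrative):** A confirmation-gate sketch combining `get_run_details`, `apply_run`, and `discard_run`; the run ID is a placeholder and the approval flag stands in for whatever review process you use.
```python
import asyncio

from terraform_cloud_mcp.tools.runs import apply_run, discard_run, get_run_details


async def main() -> None:
    run_id = "run-AbCdEf12345678"  # placeholder run awaiting confirmation
    approved = True  # outcome of a human or policy review

    status = (await get_run_details(run_id))["data"]["attributes"]["status"]
    if status != "planned":
        print(f"Run is in status {status!r}; nothing to confirm")
    elif approved:
        await apply_run(run_id, comment="Reviewed plan output; approved")
    else:
        await discard_run(run_id, comment="Plan rejected during review")


asyncio.run(main())
```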
### cancel_run
**Function:** `cancel_run(run_id: str, comment: str = "") -> Dict[str, Any]`
**Description:** Gracefully cancels a run that is currently planning or applying.
**Parameters:**
- `run_id` (str): The ID of the run to cancel
- `comment` (str, optional): Comment explaining the cancellation reason
**Returns:** JSON response with updated run details showing canceled state.
**Notes:**
- Requires "cancel" permission on the workspace
- Run must be in an active state (planning, applying)
- Attempts to gracefully terminate the process
### force_cancel_run
**Function:** `force_cancel_run(run_id: str, comment: str = "") -> Dict[str, Any]`
**Description:** Force cancels a run immediately.
**Parameters:**
- `run_id` (str): The ID of the run to force cancel
- `comment` (str, optional): Comment explaining the force cancellation reason
**Returns:** JSON response with updated run details showing force-canceled state.
**Notes:**
- Requires "cancel" permission on the workspace
- Use only when normal cancellation doesn't work
- May result in inconsistent state if used during apply
- Immediately unlocks the workspace
### force_execute_run
**Function:** `force_execute_run(run_id: str) -> Dict[str, Any]`
**Description:** Cancels all prior runs to execute a specific run immediately.
**Parameters:**
- `run_id` (str): The ID of the run to execute
**Returns:** JSON response confirming the run has been promoted.
**Notes:**
- Requires "cancel" permission on the workspace
- Cancels all pending runs in the queue to prioritize this run
- Useful for urgent changes or when runs are queued
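**Example (illustrative):** A cancellation sketch that attempts a graceful cancel first and escalates to a force cancel only if the run is still active after a grace period; the run ID, status set, and sleep interval are placeholders.
```python
import asyncio

from terraform_cloud_mcp.tools.runs import cancel_run, force_cancel_run, get_run_details

ACTIVE_STATUSES = {"pending", "planning", "applying"}


async def main() -> None:
    run_id = "run-AbCdEf12345678"
    await cancel_run(run_id, comment="Superseded by a newer change")

    await asyncio.sleep(60)  # grace period before escalating
    status = (await get_run_details(run_id))["data"]["attributes"]["status"]
    if status in ACTIVE_STATUSES:
        await force_cancel_run(run_id, comment="Graceful cancel did not complete")


asyncio.run(main())
```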
**Common Error Scenarios:**
| Error | Cause | Solution |
|-------|-------|----------|
| 404 | Run not found | Verify the run ID |
| 403 | Insufficient permissions | Ensure you have proper permissions |
| 409 | Run cannot be applied/discarded/canceled | Verify run is in correct state |
| 422 | Workspace locked by another run | Wait for the current run to finish or cancel it |
| 409 | Workspace already has active run | Cancel the active run or wait for it to complete |
```
--------------------------------------------------------------------------------
/docs/API_REFERENCES.md:
--------------------------------------------------------------------------------
```markdown
# Terraform Cloud API References
This document contains links to all available Terraform Cloud API reference documentation.
## Implementation Status
✅ = Implemented with response filtering optimization  
⏸️ = Not yet implemented  
💰 = Requires paid Terraform Cloud tier
## API Documentation
### Core
- [Overview](https://developer.hashicorp.com/terraform/cloud-docs/api-docs)
- [Changelog](https://developer.hashicorp.com/terraform/cloud-docs/api-docs/changelog)
- [Stability Policy](https://developer.hashicorp.com/terraform/cloud-docs/api-docs/stability-policy)
### APIs
- [✅] [Account](https://developer.hashicorp.com/terraform/cloud-docs/api-docs/account) - User profile and authentication info
- [⏸️] [Agent Pools](https://developer.hashicorp.com/terraform/cloud-docs/api-docs/agents) 💰 - Private agent management
- [⏸️] [Agent Tokens](https://developer.hashicorp.com/terraform/cloud-docs/api-docs/agent-tokens) 💰 - Agent authentication
- [✅] [Applies](https://developer.hashicorp.com/terraform/cloud-docs/api-docs/applies) - Infrastructure apply operations
- [⏸️] [Audit Trails](https://developer.hashicorp.com/terraform/cloud-docs/api-docs/audit-trails) 💰💰💰 - Organization audit logs
- [⏸️] [Audit Trails Tokens](https://developer.hashicorp.com/terraform/cloud-docs/api-docs/audit-trails-tokens) 💰💰💰 - Audit access tokens
- [✅] [Assessment Results](https://developer.hashicorp.com/terraform/cloud-docs/api-docs/assessment-results) 💰💰💰 - Health assessments
- [⏸️] [Change Requests](https://developer.hashicorp.com/terraform/cloud-docs/api-docs/change-requests) 💰💰💰 - Infrastructure change workflows
- [⏸️] [Comments](https://developer.hashicorp.com/terraform/cloud-docs/api-docs/comments) - Run and plan comments
- [⏸️] [Configuration Versions](https://developer.hashicorp.com/terraform/cloud-docs/api-docs/configuration-versions) - Terraform configuration uploads
- [✅] [Cost Estimates](https://developer.hashicorp.com/terraform/cloud-docs/api-docs/cost-estimates) - Infrastructure cost projections
- [ ] [Explorer](https://developer.hashicorp.com/terraform/cloud-docs/api-docs/explorer)
- [ ] [Feature Sets](https://developer.hashicorp.com/terraform/cloud-docs/api-docs/feature-sets) 💰
- [ ] [GitHub App Installations](https://developer.hashicorp.com/terraform/cloud-docs/api-docs/github-app-installations)
- [ ] [Invoices](https://developer.hashicorp.com/terraform/cloud-docs/api-docs/invoices)
- [ ] [IP Ranges](https://developer.hashicorp.com/terraform/cloud-docs/api-docs/ip-ranges)
- [ ] [No-Code Provisioning](https://developer.hashicorp.com/terraform/cloud-docs/api-docs/no-code-provisioning)
- [ ] [Notification Configurations](https://developer.hashicorp.com/terraform/cloud-docs/api-docs/notification-configurations)
- [ ] [OAuth Clients](https://developer.hashicorp.com/terraform/cloud-docs/api-docs/oauth-clients)
- [ ] [OAuth Tokens](https://developer.hashicorp.com/terraform/cloud-docs/api-docs/oauth-tokens)
- [✅] [Organizations](https://developer.hashicorp.com/terraform/cloud-docs/api-docs/organizations) - Organization settings and management
- [⏸️] [Organization Memberships](https://developer.hashicorp.com/terraform/cloud-docs/api-docs/organization-memberships) - User membership management
- [⏸️] [Organization Tags](https://developer.hashicorp.com/terraform/cloud-docs/api-docs/organization-tags) - Organization-wide tagging
- [⏸️] [Organization Tokens](https://developer.hashicorp.com/terraform/cloud-docs/api-docs/organization-tokens) - API token management
- [⏸️] [Plan Exports](https://developer.hashicorp.com/terraform/cloud-docs/api-docs/plan-exports) - Plan data export functionality
- [✅] [Plans](https://developer.hashicorp.com/terraform/cloud-docs/api-docs/plans) - Terraform execution plans
- [ ] [Policies](https://developer.hashicorp.com/terraform/cloud-docs/api-docs/policies) 💰💰
- [ ] [Policy Checks](https://developer.hashicorp.com/terraform/cloud-docs/api-docs/policy-checks) 💰💰
- [ ] [Policy Evaluations](https://developer.hashicorp.com/terraform/cloud-docs/api-docs/policy-evaluations) 💰💰
- [ ] [Policy Sets](https://developer.hashicorp.com/terraform/cloud-docs/api-docs/policy-sets) 💰💰
- [ ] [Policy Set Parameters](https://developer.hashicorp.com/terraform/cloud-docs/api-docs/policy-set-params) 💰💰
- Private Registry
  - [ ] [Modules](https://developer.hashicorp.com/terraform/cloud-docs/api-docs/private-registry/modules)
  - [ ] [Manage Module Versions](https://developer.hashicorp.com/terraform/cloud-docs/api-docs/private-registry/manage-module-versions)
  - [ ] [Providers](https://developer.hashicorp.com/terraform/cloud-docs/api-docs/private-registry/providers)
  - [ ] [Private Provider Versions and Platforms](https://developer.hashicorp.com/terraform/cloud-docs/api-docs/private-registry/provider-versions-platforms)
  - [ ] [GPG Keys](https://developer.hashicorp.com/terraform/cloud-docs/api-docs/private-registry/gpg-keys)
  - [ ] [Tests](https://developer.hashicorp.com/terraform/cloud-docs/api-docs/private-registry/tests)
- [✅] [Projects](https://developer.hashicorp.com/terraform/cloud-docs/api-docs/projects) - Workspace organization and management
- [⏸️] [Project Team Access](https://developer.hashicorp.com/terraform/cloud-docs/api-docs/project-team-access) 💰 - Team permissions for projects
- [⏸️] [Reserved Tag Keys](https://developer.hashicorp.com/terraform/cloud-docs/api-docs/reserved-tag-keys) - System-reserved tag management
- [✅] [Runs](https://developer.hashicorp.com/terraform/cloud-docs/api-docs/run) - Terraform execution runs
- Run Tasks
  - [ ] [Run Tasks](https://developer.hashicorp.com/terraform/cloud-docs/api-docs/run-tasks/run-tasks) 💰💰
  - [ ] [Stages and Results](https://developer.hashicorp.com/terraform/cloud-docs/api-docs/run-tasks/run-task-stages-and-results) 💰💰
  - [ ] [Custom Integration](https://developer.hashicorp.com/terraform/cloud-docs/api-docs/run-tasks/run-tasks-integration) 💰💰
- [ ] [Run Triggers](https://developer.hashicorp.com/terraform/cloud-docs/api-docs/run-triggers)
- [ ] [SSH Keys](https://developer.hashicorp.com/terraform/cloud-docs/api-docs/ssh-keys)
- [✅] [State Versions](https://developer.hashicorp.com/terraform/cloud-docs/api-docs/state-versions) - Terraform state management
- [✅] [State Version Outputs](https://developer.hashicorp.com/terraform/cloud-docs/api-docs/state-version-outputs) - State output values
- [⏸️] [Subscriptions](https://developer.hashicorp.com/terraform/cloud-docs/api-docs/subscriptions) - Billing and subscription management
- [⏸️] [Team Membership](https://developer.hashicorp.com/terraform/cloud-docs/api-docs/team-members) 💰 - Team member management
- [⏸️] [Team Tokens](https://developer.hashicorp.com/terraform/cloud-docs/api-docs/team-tokens) 💰 - Team-level API tokens
- [⏸️] [Teams](https://developer.hashicorp.com/terraform/cloud-docs/api-docs/teams) 💰 - Team management
- [⏸️] [User Tokens](https://developer.hashicorp.com/terraform/cloud-docs/api-docs/user-tokens) - User API token management
- [⏸️] [Users](https://developer.hashicorp.com/terraform/cloud-docs/api-docs/users) - User account management
- [✅] [Variables](https://developer.hashicorp.com/terraform/cloud-docs/api-docs/variables) - Terraform variable management
- [✅] [Variable Sets](https://developer.hashicorp.com/terraform/cloud-docs/api-docs/variable-sets) - Reusable variable collections
- [⏸️] [VCS Events](https://developer.hashicorp.com/terraform/cloud-docs/api-docs/vcs-events) - Version control system events
- [✅] [Workspaces](https://developer.hashicorp.com/terraform/cloud-docs/api-docs/workspaces) 💰 - Terraform workspace management
- [✅] [Workspace-Specific Variables](https://developer.hashicorp.com/terraform/cloud-docs/api-docs/workspace-variables) - Workspace variable management
- [⏸️] [Workspace Team Access](https://developer.hashicorp.com/terraform/cloud-docs/api-docs/team-access) 💰 - Team permissions for workspaces
- [⏸️] [Workspace Resources](https://developer.hashicorp.com/terraform/cloud-docs/api-docs/workspace-resources) - Resource tracking and management
```
--------------------------------------------------------------------------------
/terraform_cloud_mcp/tools/state_versions.py:
--------------------------------------------------------------------------------
```python
"""Terraform Cloud state version management tools.
This module provides tools for working with state versions in Terraform Cloud.
It includes functions to retrieve, list, and create state versions.
Reference: https://developer.hashicorp.com/terraform/cloud-docs/api-docs/state-versions
"""
from typing import Optional
from ..api.client import api_request
from ..models.base import APIResponse
from ..models.state_versions import (
    CurrentStateVersionRequest,
    StateVersionCreateRequest,
    StateVersionListRequest,
    StateVersionParams,
    StateVersionRequest,
    StateVersionStatus,
)
from ..utils.decorators import handle_api_errors
from ..utils.payload import create_api_payload, add_relationship
from ..utils.request import query_params
@handle_api_errors
async def list_state_versions(
    organization: str,
    workspace_name: str,
    page_number: int = 1,
    page_size: int = 20,
    filter_status: Optional[str] = None,
) -> APIResponse:
    """List state versions in a workspace.
    Retrieves a paginated list of all state versions in a Terraform Cloud workspace.
    Results can be filtered using status to find specific state versions.
    API endpoint: GET /state-versions
    Args:
        organization: The name of the organization that owns the workspace
        workspace_name: The name of the workspace to list state versions from
        page_number: The page number to return (default: 1)
        page_size: The number of items per page (default: 20, max: 100)
        filter_status: Filter state versions by status: 'pending', 'finalized', or 'discarded'
    Returns:
        Paginated list of state versions with their configuration settings and metadata
    See:
        docs/tools/state_versions.md for reference documentation
    """
    # Convert filter_status string to enum if provided
    status_enum = None
    if filter_status:
        try:
            status_enum = StateVersionStatus(filter_status)
        except ValueError:
            valid_values = ", ".join([s.value for s in StateVersionStatus])
            raise ValueError(
                f"Invalid filter_status value: {filter_status}. Valid values: {valid_values}"
            )
    # Validate parameters
    params = StateVersionListRequest(
        filter_workspace_name=workspace_name,
        filter_organization_name=organization,
        page_number=page_number,
        page_size=page_size,
        filter_status=status_enum,
    )
    # Build query parameters using utility function
    query = query_params(params)
    # Make API request
    return await api_request("state-versions", params=query)
@handle_api_errors
async def get_current_state_version(workspace_id: str) -> APIResponse:
    """Get the current state version for a workspace.
    Retrieves the current state version for a workspace, which is the input
    state when running terraform operations.
    API endpoint: GET /workspaces/:workspace_id/current-state-version
    Args:
        workspace_id: The ID of the workspace (format: "ws-xxxxxxxx")
    Returns:
        The current state version including details and download URLs
    See:
        docs/tools/state_versions.md for reference documentation
    """
    # Validate parameters
    params = CurrentStateVersionRequest(workspace_id=workspace_id)
    # Make API request
    return await api_request(f"workspaces/{params.workspace_id}/current-state-version")
@handle_api_errors
async def get_state_version(state_version_id: str) -> APIResponse:
    """Get details for a specific state version.
    Retrieves comprehensive information about a state version including its status,
    download URLs, and resource information.
    API endpoint: GET /state-versions/:state_version_id
    Args:
        state_version_id: The ID of the state version to retrieve (format: "sv-xxxxxxxx")
    Returns:
        State version details including status, timestamps, and resource metadata
    See:
        docs/tools/state_versions.md for reference documentation
    """
    # Validate parameters
    params = StateVersionRequest(state_version_id=state_version_id)
    # Make API request
    return await api_request(f"state-versions/{params.state_version_id}")
@handle_api_errors
async def create_state_version(
    workspace_id: str,
    serial: int,
    md5: str,
    params: Optional[StateVersionParams] = None,
) -> APIResponse:
    """Create a state version in a workspace.
    Creates a new state version and sets it as the current state version for the
    given workspace. The workspace must be locked by the user creating a state version.
    This is most useful for migrating existing state from Terraform Community edition
    into a new HCP Terraform workspace.
    API endpoint: POST /workspaces/:workspace_id/state-versions
    Args:
        workspace_id: The ID of the workspace (format: "ws-xxxxxxxx")
        serial: The serial number of this state instance
        md5: An MD5 hash of the raw state version
        params: Additional state version parameters (optional):
            - state: Base64 encoded raw state file
            - lineage: Lineage of the state version
            - json_state: Base64 encoded JSON state
            - json_state_outputs: Base64 encoded JSON state outputs
            - run_id: The ID of the run to associate with the state version
    Returns:
        The created state version data including download URLs and status information
    See:
        docs/tools/state_versions.md for reference documentation
    """
    # Extract parameters from params object
    param_dict = params.model_dump(exclude_none=True, by_alias=False) if params else {}
    # Add required parameters
    param_dict["serial"] = serial
    param_dict["md5"] = md5
    # Create request using Pydantic model
    request_params = StateVersionCreateRequest(workspace_id=workspace_id, **param_dict)
    # Create API payload using utility function
    payload = create_api_payload(
        resource_type="state-versions",
        model=request_params,
        exclude_fields={"workspace_id"},
    )
    # Add relationship if run_id is provided
    if param_dict.get("run_id"):
        payload = add_relationship(
            payload=payload,
            relation_name="run",
            resource_type="runs",
            resource_id=param_dict["run_id"],
        )
    # Make API request
    return await api_request(
        f"workspaces/{request_params.workspace_id}/state-versions",
        method="POST",
        data=payload,
    )
@handle_api_errors
async def download_state_file(
    state_version_id: str, json_format: bool = False
) -> APIResponse:
    """Download the state file content.
    Retrieves the raw state file or JSON formatted state file for a specific state version.
    API endpoint: Uses the hosted URLs from GET /state-versions/:state_version_id
    Args:
        state_version_id: The ID of the state version (format: "sv-xxxxxxxx")
        json_format: Whether to download the JSON formatted state (default: False)
    Returns:
        The raw state file content or JSON formatted state content
    See:
        docs/tools/state_versions.md for reference documentation
    """
    # Validate parameters
    params = StateVersionRequest(state_version_id=state_version_id)
    # First get state version details to get the download URL
    state_version = await api_request(f"state-versions/{params.state_version_id}")
    # Determine which URL to use based on format request
    url_attr = (
        "hosted-json-state-download-url" if json_format else "hosted-state-download-url"
    )
    download_url = state_version.get("data", {}).get("attributes", {}).get(url_attr)
    # Check if URL is available
    if not download_url:
        if json_format:
            return {
                "error": "JSON state download URL not available. This may be because the state was not created with Terraform 1.3+"
            }
        else:
            return {"error": "State download URL not available for this state version"}
    # Use the enhanced api_request to fetch state from the external URL
    return await api_request(download_url, external_url=True, accept_text=True)
```
--------------------------------------------------------------------------------
/docs/models/organization.md:
--------------------------------------------------------------------------------
```markdown
# Organization Models
This document describes the data models used for organization operations in Terraform Cloud.
## Overview
Organization models provide structure and validation for interacting with the Terraform Cloud Organizations API. These models define organization settings, authentication policies, and default configurations for workspaces.
## Models Reference
### CollaboratorAuthPolicy
**Type:** Enum (string)
**Description:** Authentication policy options for organization collaborators.
**Values:**
- `password`: Password-only authentication is allowed
- `two_factor_mandatory`: Two-factor authentication is required for all users
**JSON representation:**
```json
{
  "collaborator-auth-policy": "two_factor_mandatory"
}
```
**Usage Context:**
Used to enforce security policies across an organization. Setting this to `two_factor_mandatory` requires all users to have 2FA enabled before they can access organization resources.
### ExecutionMode
**Type:** Enum (string)
**Description:** Execution mode options for workspaces and organizations.
**Values:**
- `remote`: Terraform runs on Terraform Cloud's infrastructure
- `local`: Terraform runs on your local machine
- `agent`: Terraform runs on your own infrastructure using an agent
**JSON representation:**
```json
{
  "default-execution-mode": "remote"
}
```
**Usage Context:**
Used to define how Terraform operations are executed for workspaces in an organization.
### OrganizationParams
**Type:** Object
**Description:** Parameters for organization operations. Used to configure organization settings when creating or updating organizations.
**Fields:**
- `name` (string, optional): Name of the organization (min length: 3, must match pattern: ^[a-z0-9][-a-z0-9_]*[a-z0-9]$)
- `email` (string, optional): Admin email address (must be valid email format)
- `collaborator_auth_policy` (CollaboratorAuthPolicy, optional): Authentication policy (password or two_factor_mandatory)
- `session_timeout` (integer, optional): Session timeout after inactivity in minutes (default: 20160)
- `session_remember` (integer, optional): Session total expiration time in minutes (default: 20160)
- `cost_estimation_enabled` (boolean, optional): Whether to enable cost estimation (default: false)
- `default_execution_mode` (ExecutionMode, optional): Default execution mode for workspaces (remote, local, agent)
- `aggregated_commit_status_enabled` (boolean, optional): Whether to aggregate VCS status updates (default: true)
- `speculative_plan_management_enabled` (boolean, optional): Whether to auto-cancel unused speculative plans (default: true)
- `assessments_enforced` (boolean, optional): Whether to enforce health assessments for all workspaces (default: false)
- `allow_force_delete_workspaces` (boolean, optional): Allow deleting workspaces with resources (default: false)
- `default_agent_pool_id` (string, optional): Default agent pool ID (required when default_execution_mode=agent)
- `owners_team_saml_role_id` (string, optional): The SAML role for owners team
- `send_passing_statuses_for_untriggered_speculative_plans` (boolean, optional): Whether to send VCS status for untriggered plans (default: false)
**JSON representation:**
```json
{
  "data": {
    "type": "organizations",
    "attributes": {
      "name": "my-organization",
      "email": "[email protected]",
      "session-timeout": 60,
      "session-remember": 1440,
      "collaborator-auth-policy": "two_factor_mandatory",
      "cost-estimation-enabled": true,
      "default-execution-mode": "remote",
      "assessments-enforced": true
    }
  }
}
```
**Notes:**
- Field names in JSON use kebab-case format (e.g., "default-execution-mode")
- Field names in the model use snake_case format (e.g., default_execution_mode)
- All fields are optional when updating (PATCH) but name and email are required when creating (POST)
**Validation Rules:**
- Organization names must be 3 or more characters
- Organization names must start and end with a lowercase letter or number
- Organization names may contain lowercase letters, numbers, hyphens, and underscores
- Email addresses must be valid format
- Session timeout and session remember values must be between 1 and 43200 minutes (30 days)
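**Example (illustrative):** A sketch of building these parameters in code. `OrganizationParams` is imported the same way the organization tools module imports it; the `update_organization(organization, params=...)` call shape is assumed from `OrganizationUpdateRequest`, and the organization name and email are placeholders.
```python
import asyncio

from terraform_cloud_mcp.models.organizations import OrganizationParams
from terraform_cloud_mcp.tools.organizations import update_organization


async def main() -> None:
    params = OrganizationParams(
        email="admin@example.com",  # placeholder address
        cost_estimation_enabled=True,
        collaborator_auth_policy="two_factor_mandatory",
        session_timeout=60,
    )
    result = await update_organization("my-organization", params=params)
    print(result["data"]["attributes"]["cost-estimation-enabled"])


asyncio.run(main())
```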
### OrganizationDetailsRequest
**Type:** Request Validation Model
**Description:** Used to validate parameters for retrieving organization details.
**Fields:**
- `organization` (string, required): The name of the organization to retrieve details for
**Used by:**
- `get_organization_details` tool function to validate the organization name parameter
### OrganizationEntitlementsRequest
**Type:** Request Validation Model
**Description:** Used to validate parameters for retrieving organization entitlements.
**Fields:**
- `organization` (string, required): The name of the organization to retrieve entitlements for
**Used by:**
- `get_organization_entitlements` tool function to validate the organization name parameter
### OrganizationDeleteRequest
**Type:** Request Validation Model
**Description:** Used to validate parameters for deleting an organization.
**Fields:**
- `organization` (string, required): The name of the organization to delete
**Used by:**
- `delete_organization` tool function to validate the organization name parameter
### OrganizationListRequest
**Type:** Request Validation Model
**Description:** Used to validate parameters for listing organizations.
**Fields:**
- `page_number` (int, optional): Page number to return (default: 1)
- `page_size` (int, optional): Number of items per page (default: 20)
- `q` (str, optional): Search query to filter by name and email
- `query_email` (str, optional): Search query to filter by email only
- `query_name` (str, optional): Search query to filter by name only
**Used by:**
- `list_organizations` tool function to validate pagination and search parameters
### OrganizationCreateRequest
**Type:** Request Validation Model
**Description:** Used to validate parameters for creating an organization.
**Fields:**
- `name` (string, required): The name for the organization
- `email` (string, required): Admin email address
- `params` (OrganizationParams, optional): Additional configuration options
**Used by:**
- `create_organization` tool function to validate organization creation parameters
### OrganizationUpdateRequest
**Type:** Request Validation Model
**Description:** Used to validate parameters for updating an organization.
**Fields:**
- `organization` (string, required): The name of the organization to update
- `params` (OrganizationParams, optional): Settings to update
**Used by:**
- `update_organization` tool function to validate organization update parameters
## API Response Structure
### Organization Details Response
```json
{
  "data": {
    "id": "org-ABcd1234",
    "type": "organizations",
    "attributes": {
      "name": "my-organization",
      "external-id": "org-12345",
      "created-at": "2023-01-15T12:34:56Z",
      "email": "[email protected]",
      "session-timeout": 60,
      "session-remember": 1440,
      "collaborator-auth-policy": "two_factor_mandatory",
      "cost-estimation-enabled": true,
      "default-execution-mode": "remote",
      "permissions": {
        "can-destroy": true,
        "can-update": true
      }
    },
    "relationships": {
      "subscription": {
        "data": {
          "id": "sub-XYZ789",
          "type": "subscriptions"
        }
      }
    },
    "links": {
      "self": "/api/v2/organizations/my-organization"
    }
  }
}
```
### Organization Entitlements Response
```json
{
  "data": {
    "id": "org-ABcd1234",
    "type": "organization-entitlements",
    "attributes": {
      "cost-estimation": true,
      "operations": true,
      "private-modules": true,
      "sentinel": true,
      "teams": true,
      "vcs-integrations": true,
      "usage-reporting": true,
      "self-serve-billing": true,
      "state-storage": {
        "unlimited": true
      },
      "teams": {
        "limit": 5,
        "used": 3
      },
      "private-module-registry": {
        "limit": 100,
        "used": 25
      }
    }
  }
}
```
## Related Resources
- [Organization Tools](../tools/organization.md)
- [Workspace Models](workspace.md)
- [Terraform Cloud API - Organizations](https://developer.hashicorp.com/terraform/cloud-docs/api-docs/organizations)
```
--------------------------------------------------------------------------------
/terraform_cloud_mcp/tools/organizations.py:
--------------------------------------------------------------------------------
```python
"""Organization management tools for Terraform Cloud MCP
This module implements the /organizations endpoints of the Terraform Cloud API.
Reference: https://developer.hashicorp.com/terraform/cloud-docs/api-docs/organizations
"""
from typing import Optional
from ..api.client import api_request
from ..utils.decorators import handle_api_errors
from ..utils.payload import create_api_payload
from ..utils.request import query_params
from ..models.base import APIResponse
from ..models.organizations import (
    OrganizationCreateRequest,
    OrganizationUpdateRequest,
    OrganizationListRequest,
    OrganizationParams,
    OrganizationDetailsRequest,
    OrganizationEntitlementsRequest,
    OrganizationDeleteRequest,
)
@handle_api_errors
async def get_organization_details(organization: str) -> APIResponse:
    """Get details for a specific organization
    Retrieves comprehensive information about an organization including settings,
    email contact info, and configuration defaults.
    API endpoint: GET /organizations/{organization}
    Args:
        organization: The organization name to retrieve details for (required)
    Returns:
        Organization details including name, email, settings and configuration
    See:
        docs/tools/organization.md for reference documentation
    """
    request = OrganizationDetailsRequest(organization=organization)
    return await api_request(f"organizations/{request.organization}")
@handle_api_errors
async def get_organization_entitlements(organization: str) -> APIResponse:
    """Show entitlement set for organization features
    Retrieves information about available features and capabilities based on
    the organization's subscription tier.
    API endpoint: GET /organizations/{organization}/entitlement-set
    Args:
        organization: The organization name to retrieve entitlements for (required)
    Returns:
        Entitlement set details including feature limits and subscription information
    See:
        docs/tools/organization.md for reference documentation
    """
    request = OrganizationEntitlementsRequest(organization=organization)
    return await api_request(f"organizations/{request.organization}/entitlement-set")
@handle_api_errors
async def list_organizations(
    page_number: int = 1,
    page_size: int = 20,
    q: Optional[str] = None,
    query_email: Optional[str] = None,
    query_name: Optional[str] = None,
) -> APIResponse:
    """List organizations with filtering options
    Retrieves a paginated list of organizations the current user has access to,
    with options to search by name or email address.
    API endpoint: GET /organizations
    Args:
        page_number: Page number to fetch (default: 1)
        page_size: Number of results per page (default: 20)
        q: Search query to filter by name and email
        query_email: Search query to filter by email only
        query_name: Search query to filter by name only
    Returns:
        List of organizations with metadata and pagination information
    See:
        docs/tools/organization.md for reference documentation
    """
    request = OrganizationListRequest(
        page_number=page_number,
        page_size=page_size,
        q=q,
        query_email=query_email,
        query_name=query_name,
    )
    # Get all query parameters - now automatically handles query_email and query_name
    params = query_params(request)
    return await api_request("organizations", params=params)
@handle_api_errors
async def create_organization(
    name: str, email: str, params: Optional[OrganizationParams] = None
) -> APIResponse:
    """Create a new organization in Terraform Cloud
    Creates a new organization with the given name and email, allowing workspaces
    and teams to be created within it. This is the first step in setting up a new
    environment in Terraform Cloud.
    API endpoint: POST /organizations
    Args:
        name: The name of the organization (required)
        email: Admin email address (required)
        params: Additional organization settings:
            - collaborator_auth_policy: Authentication policy (password or two_factor_mandatory)
            - session_timeout: Session timeout after inactivity in minutes
            - session_remember: Session total expiration time in minutes
            - cost_estimation_enabled: Whether to enable cost estimation for workspaces
            - default_execution_mode: Default workspace execution mode (remote, local, agent)
            - aggregated_commit_status_enabled: Whether to aggregate VCS status updates
            - speculative_plan_management_enabled: Whether to auto-cancel unused speculative plans
            - assessments_enforced: Whether to enforce health assessments for all workspaces
            - allow_force_delete_workspaces: Whether to allow deleting workspaces with resources
            - default_agent_pool_id: Default agent pool ID (required when using agent mode)
    Returns:
        The created organization details including ID and created timestamp
    See:
        docs/tools/organization.md for reference documentation
    """
    # Extract parameters from the params object if provided
    param_dict = params.model_dump(exclude_none=True) if params else {}
    # Create request using Pydantic model with defaults
    request = OrganizationCreateRequest(name=name, email=email, **param_dict)
    # Create API payload using utility function
    payload = create_api_payload(resource_type="organizations", model=request)
    # Make the API request
    return await api_request("organizations", method="POST", data=payload)
@handle_api_errors
async def update_organization(
    organization: str, params: Optional[OrganizationParams] = None
) -> APIResponse:
    """Update an existing organization in Terraform Cloud
    Modifies organization settings such as email contact, authentication policy,
    or other configuration options. Only specified attributes will be updated.
    API endpoint: PATCH /organizations/{organization}
    Args:
        organization: The name of the organization to update (required)
        params: Organization parameters to update:
            - email: Admin email address for the organization
            - collaborator_auth_policy: Authentication policy (password or two_factor_mandatory)
            - session_timeout: Session timeout after inactivity in minutes
            - session_remember: Session total expiration time in minutes
            - cost_estimation_enabled: Whether to enable cost estimation for workspaces
            - default_execution_mode: Default workspace execution mode (remote, local, agent)
            - aggregated_commit_status_enabled: Whether to aggregate VCS status updates
            - speculative_plan_management_enabled: Whether to auto-cancel unused speculative plans
            - assessments_enforced: Whether to enforce health assessments for all workspaces
            - allow_force_delete_workspaces: Whether to allow deleting workspaces with resources
    Returns:
        The updated organization with all current settings
    See:
        docs/tools/organization.md for reference documentation
    """
    # Extract parameters from the params object if provided
    param_dict = params.model_dump(exclude_none=True) if params else {}
    # Create request using Pydantic model
    request = OrganizationUpdateRequest(organization=organization, **param_dict)
    # Create API payload using utility function
    payload = create_api_payload(
        resource_type="organizations", model=request, exclude_fields={"organization"}
    )
    # Make the API request
    return await api_request(
        f"organizations/{organization}", method="PATCH", data=payload
    )
@handle_api_errors
async def delete_organization(organization: str) -> APIResponse:
    """Delete an organization from Terraform Cloud
    Permanently removes an organization including all its workspaces, teams, and resources.
    This action cannot be undone. Organization names are globally unique and cannot be
    recreated with the same name later.
    API endpoint: DELETE /organizations/{organization}
    Args:
        organization: The name of the organization to delete (required)
    Returns:
        Success confirmation (HTTP 204 No Content) or error details
    See:
        docs/tools/organization.md for reference documentation
    """
    # Create request using Pydantic model
    request = OrganizationDeleteRequest(organization=organization)
    # Make API request
    return await api_request(f"organizations/{request.organization}", method="DELETE")
```
--------------------------------------------------------------------------------
/terraform_cloud_mcp/utils/filters.py:
--------------------------------------------------------------------------------
```python
"""Response filtering utilities for Terraform Cloud MCP
This module provides smart filtering of API responses to reduce token usage
while preserving essential data for MCP operations.
"""
from typing import Dict, Any, Callable, Union
from ..models.filters import FilterConfig, OperationType, ResourceType, FilterRequest
from ..configs.filter_configs import (
    FILTER_CONFIGS,
    RESOURCE_TYPE_MAP,
    PATH_PATTERNS,
    DATA_TYPE_MAP,
)
def filter_response(
    data: Dict[str, Any],
    resource_type: Union[str, ResourceType],
    operation_type: Union[str, OperationType] = OperationType.READ,
) -> Dict[str, Any]:
    """Filter API response to remove unnecessary fields."""
    if not isinstance(data, dict) or "data" not in data:
        return data
    # Shallow copy response for performance
    filtered_data = data.copy()
    if isinstance(filtered_data["data"], list):
        filtered_data["data"] = [
            item.copy() if isinstance(item, dict) else item
            for item in filtered_data["data"]
        ]
    elif isinstance(filtered_data["data"], dict):
        filtered_data["data"] = filtered_data["data"].copy()
    for key in ("meta", "links"):
        if key in filtered_data:
            filtered_data[key] = filtered_data[key].copy()
    # Normalize types to enums
    if isinstance(resource_type, str):
        normalized_type = RESOURCE_TYPE_MAP.get(resource_type, ResourceType.GENERIC)
    elif isinstance(resource_type, ResourceType):
        normalized_type = resource_type
    else:
        raise ValueError(f"Invalid resource_type: {resource_type}")
    if isinstance(operation_type, str):
        try:
            operation_enum = OperationType(operation_type)
        except ValueError:
            raise ValueError(f"Invalid operation_type: {operation_type}")
    elif isinstance(operation_type, OperationType):
        operation_enum = operation_type
    else:
        raise ValueError(f"Invalid operation_type: {operation_type}")
    # Handle single item or list of items
    if isinstance(filtered_data["data"], list):
        for item in filtered_data["data"]:
            _filter_item_attributes(item, normalized_type, operation_enum)
    else:
        _filter_item_attributes(filtered_data["data"], normalized_type, operation_enum)
    # Filter top-level metadata for list operations
    if operation_enum == OperationType.LIST:
        _filter_list_metadata(filtered_data)
    return filtered_data
def filter_with_request(data: Dict[str, Any], request: FilterRequest) -> Dict[str, Any]:
    """Filter API response using a FilterRequest object."""
    # Apply base filtering
    filtered_data = filter_response(data, request.resource_type, request.operation_type)
    # Apply custom field removals if specified (skip when there is no "data" key)
    if request.custom_fields and "data" in filtered_data:
        if isinstance(filtered_data["data"], list):
            for item in filtered_data["data"]:
                _remove_custom_fields(item, request.custom_fields)
        else:
            _remove_custom_fields(filtered_data["data"], request.custom_fields)
    # Restore preserved fields if specified
    if request.preserve_fields:
        # This would require access to original data - placeholder for future implementation
        pass
    return filtered_data
def _filter_item_attributes(
    item: Dict[str, Any], resource_type: ResourceType, operation_type: OperationType
) -> None:
    """Filter individual item attributes in-place."""
    if "attributes" not in item:
        return
    # Shallow copy attributes to avoid modifying original
    if not isinstance(item["attributes"], dict):
        return
    item["attributes"] = item["attributes"].copy()
    attrs = item["attributes"]
    config = FILTER_CONFIGS.get(resource_type, FilterConfig())
    # Build set of fields to remove using Pydantic model
    fields_to_remove = set(config.always_remove)
    if operation_type == OperationType.READ:
        fields_to_remove.update(config.read_remove)
    elif operation_type == OperationType.LIST:
        fields_to_remove.update(config.list_remove)
    # Remove specified fields
    for field in fields_to_remove:
        attrs.pop(field, None)
    # Handle relationships
    if "relationships" in item:
        # Shallow copy relationships to avoid modifying original
        item["relationships"] = item["relationships"].copy()
        _filter_relationships(item["relationships"], resource_type, operation_type)
    # Remove item-level links
    item.pop("links", None)
def _filter_relationships(
    relationships: Dict[str, Any],
    resource_type: ResourceType,
    operation_type: OperationType,
) -> None:
    """Filter relationships in-place."""
    config = FILTER_CONFIGS.get(resource_type, FilterConfig())
    essential_rels = config.essential_relationships
    if operation_type == OperationType.READ and essential_rels:
        # Keep only essential relationships
        keys_to_remove = [k for k in relationships.keys() if k not in essential_rels]
        for key in keys_to_remove:
            relationships.pop(key, None)
    # Remove links from all remaining relationships
    for key, rel_data in relationships.items():
        if isinstance(rel_data, dict):
            # Shallow copy individual relationship to avoid modifying original
            relationships[key] = rel_data.copy()
            relationships[key].pop("links", None)
    # Nothing left to do if every relationship was filtered out
    if not relationships:
        return
def _remove_custom_fields(item: Dict[str, Any], custom_fields: set[str]) -> None:
    """Remove custom fields from an item."""
    if "attributes" not in item or not isinstance(item["attributes"], dict):
        return
    attrs = item["attributes"]
    for field in custom_fields:
        attrs.pop(field, None)
def _filter_list_metadata(data: Dict[str, Any]) -> None:
    """Filter list response metadata and pagination links in-place."""
    # Filter metadata
    if "meta" in data:
        meta = data["meta"]
        if "pagination" in meta and isinstance(meta["pagination"], dict):
            pagination = meta["pagination"]
            meta["pagination"] = {
                "current-page": pagination.get("current-page"),
                "total-pages": pagination.get("total-pages"),
                "total-count": pagination.get("total-count"),
            }
        if "status-counts" in meta and isinstance(meta["status-counts"], dict):
            status_counts = meta["status-counts"]
            if "total" in status_counts:
                meta["status-counts"] = {"total": status_counts["total"]}
            else:
                meta.pop("status-counts", None)
    # Filter pagination links
    if "links" in data and isinstance(data["links"], dict):
        links = data["links"]
        essential_links = {
            k: links[k] for k in ["next", "prev", "first", "last"] if k in links
        }
        data["links"] = essential_links
def get_response_filter(resource_type: Union[str, ResourceType]) -> Callable:
    """Get the appropriate filter function for a resource type."""
    def resource_filter(
        data: Dict[str, Any],
        operation_type: Union[str, OperationType] = OperationType.READ,
    ) -> Dict[str, Any]:
        return filter_response(data, resource_type, operation_type)
    return resource_filter
def should_filter_response(path: str, method: str) -> bool:
    """Determine if a response should be filtered based on the API path and method."""
    # Only filter GET requests
    if method.upper() != "GET":
        return False
    # Don't filter log or download endpoints
    skip_terms = ["log", "download", "json-output", "content"]
    return not any(term in path.lower() for term in skip_terms)
def detect_resource_type(path: str, data: Dict[str, Any]) -> ResourceType:
    """Detect resource type from API path and response data."""
    # Path-based detection
    for pattern, resource_type in PATH_PATTERNS:
        if pattern in path:
            return resource_type
    # Data-based fallback
    if isinstance(data, dict) and "data" in data:
        data_item = data["data"]
        if isinstance(data_item, list) and data_item:
            data_type = data_item[0].get("type", "unknown")
        elif isinstance(data_item, dict):
            data_type = data_item.get("type", "unknown")
        else:
            return ResourceType.GENERIC
        if data_type in DATA_TYPE_MAP:
            return DATA_TYPE_MAP[data_type]
        for resource_type in ResourceType:
            if resource_type.value == data_type:
                return resource_type
    return ResourceType.GENERIC
def detect_operation_type(path: str, method: str) -> OperationType:
    """Detect operation type from API path and method."""
    if method.upper() == "GET":
        id_prefixes = ["ws-", "run-", "org-", "prj-", "var-"]
        has_resource_id = any(
            segment.startswith(prefix)
            for segment in path.split("/")
            for prefix in id_prefixes
        )
        return OperationType.READ if has_resource_id else OperationType.LIST
    return OperationType.MANAGE
```
--------------------------------------------------------------------------------
/docs/conversations/workspace-management-conversation.md:
--------------------------------------------------------------------------------
```markdown
# Workspace Management - Example Conversations
This document shows example conversations with Claude using the Terraform Cloud MCP workspace management features.
## Listing and Finding Workspaces
```
User: List all my workspaces in "example-org"
Claude: I'll list all workspaces in the "example-org" organization.
[Claude uses list_workspaces tool]
Here are all the workspaces in "example-org":
1. **production** - Production environment (Terraform v1.5.2)
   - Status: Active, Unlocked
   - Last run: Applied 2 days ago
   - VCS: Connected to GitHub (main branch)
2. **staging** - Staging environment (Terraform v1.5.2)
   - Status: Active, Unlocked
   - Last run: Applied 6 hours ago
   - VCS: Connected to GitHub (develop branch)
3. **dev-environment** - Development workspace (Terraform v1.5.3)
   - Status: Active, Unlocked
   - Last run: Planning 5 minutes ago
   - VCS: Connected to GitHub (feature/new-db branch)
4. **network-infra** - Network infrastructure (Terraform v1.4.6)
   - Status: Active, Locked by [email protected]
   - Last run: Applied 2 weeks ago
   - VCS: Not connected
5. **security-baseline** - Security controls (Terraform v1.5.0)
   - Status: Active, Unlocked
   - Last run: Discarded 3 days ago
   - VCS: Connected to GitHub (main branch)
User: Find all workspaces in "example-org" with names containing "prod"
Claude: I'll search for workspaces in the "example-org" organization that have "prod" in their names.
[Claude uses list_workspaces tool]
I found the following workspaces in "example-org" with "prod" in their names:
1. **production** - Production environment (Terraform v1.5.2)
   - Status: Active, Unlocked
   - Last run: Applied 2 days ago
   - VCS: Connected to GitHub (main branch)
2. **products-api** - Products API infrastructure (Terraform v1.5.0)
   - Status: Active, Unlocked
   - Last run: Applied 5 days ago
   - VCS: Connected to GitHub (release/v2 branch)
3. **prod-db-cluster** - Production database cluster (Terraform v1.5.1)
   - Status: Active, Locked by system
   - Last run: Errored 12 hours ago
   - VCS: Connected to GitHub (main branch)
User: List workspaces in "example-org" with the "environment:production" tag
Claude: I'll find workspaces in "example-org" that have the "environment:production" tag.
[Claude uses list_workspaces tool]
Here are the workspaces in "example-org" tagged with "environment:production":
1. **production** - Production environment (Terraform v1.5.2)
   - Tags: environment:production, region:us-west-2
   - Status: Active, Unlocked
   - Last run: Applied 2 days ago
2. **prod-db-cluster** - Production database cluster (Terraform v1.5.1)
   - Tags: environment:production, service:database, region:us-west-2
   - Status: Active, Locked by system
   - Last run: Errored 12 hours ago
3. **frontend-prod** - Production frontend resources (Terraform v1.5.0)
   - Tags: environment:production, service:frontend, region:us-west-2
   - Status: Active, Unlocked
   - Last run: Applied 1 week ago
4. **api-gateway-prod** - Production API gateway (Terraform v1.4.6)
   - Tags: environment:production, service:api, region:us-west-2
   - Status: Active, Unlocked
   - Last run: Applied 3 days ago
```
## Workspace Details and Management
```
User: Get details for the "production" workspace in "example-org"
Claude: I'll retrieve detailed information about the "production" workspace in "example-org".
[Claude uses get_workspace_details tool]
Here are the details for the "production" workspace in "example-org":
**Basic Information:**
- Name: production
- Description: Production environment for main application
- ID: ws-7H9QJDXpY98V4w2Y
- Created: January 15, 2023
- Updated: May 3, 2024
**Terraform Configuration:**
- Version: 1.5.2
- Working Directory: /terraform
- Execution Mode: Remote
- Auto Apply: False
- Terraform State Version: 4
- Resource Count: 143
**VCS Integration:**
- Repository: github.com/example-org/main-infrastructure
- Branch: main
- Working Directory: /terraform
- Trigger Prefixes: ["/modules", "/environments/prod"]
**Current Status:**
- State: Active
- Lock Status: Unlocked
- Current Run: None
- Last Run: run-bP9xtXmqFoQNfjH7 (Applied 2 days ago)
**Tags:**
- environment:production
- region:us-west-2
- team:infrastructure
User: Create a new workspace in "example-org" called "staging" with auto-apply enabled
Claude: I'll create a new workspace called "staging" in the "example-org" organization with auto-apply enabled.
[Claude uses create_workspace tool]
I've successfully created the "staging" workspace in "example-org" with auto-apply enabled. Here are the details:
**Workspace Details:**
- Name: staging
- Description: Staging environment for testing
- ID: ws-2VcPMzXqD75r8nL9
- Created: Just now
- Terraform Version: 1.5.2
- Working Directory: /terraform/staging
- Execution Mode: Remote
- Auto Apply: Enabled
The workspace is now ready to use. Since auto-apply is enabled, any successful plans will automatically be applied without requiring manual approval.
User: Update my "staging" workspace to use Terraform version 1.5.0
Claude: I'll update the "staging" workspace in "example-org" to use Terraform version 1.5.0.
[Claude uses update_workspace tool]
I've successfully updated the "staging" workspace to use Terraform version 1.5.0. Here's the confirmation:
**Updated Workspace Details:**
- Name: staging
- ID: ws-2VcPMzXqD75r8nL9
- Terraform Version: 1.5.0 (previously 1.5.2)
- Working Directory: /terraform/staging
- Execution Mode: Remote
- Auto Apply: Enabled
The change has been applied immediately. Any new runs in this workspace will now use Terraform 1.5.0 instead of 1.5.2.
```
## Workspace Deletion
```
User: Delete my "dev-test" workspace in "example-org"
Claude: I'll help you delete the "dev-test" workspace in "example-org". This is a significant action that requires confirmation.
[Claude uses delete_workspace tool]
**WARNING: Confirmation Required**
You've requested to delete the "dev-test" workspace in "example-org". This action:
- Will permanently delete the workspace and all its configurations
- Will delete all workspace variables
- Will delete all state versions and state snapshots
- Cannot be undone
To confirm deletion, please explicitly confirm that you want to proceed.
User: Yes, please delete the workspace
Claude: I'll proceed with deleting the "dev-test" workspace as confirmed.
[Claude uses delete_workspace tool]
**Deletion Successful**
The "dev-test" workspace in "example-org" has been permanently deleted. All associated resources, variables, and state files have been removed.
User: Delete my "staging" workspace but only if it's not managing any resources
Claude: I'll help you safely delete the "staging" workspace, but only if it's not currently managing any resources.
[Claude uses safe_delete_workspace tool]
**WARNING: Confirmation Required**
You've requested to safely delete the "staging" workspace in "example-org". This operation:
- Will only proceed if the workspace is not managing any resources
- Will fail if there are resources under management
- Will delete all workspace configurations and variables
- Cannot be undone
To proceed with the safe deletion, please confirm.
User: Yes, proceed with the safe delete
Claude: I'll proceed with the safe deletion of the "staging" workspace.
[Claude uses safe_delete_workspace tool]
**Safe Deletion Successful**
The "staging" workspace in "example-org" has been safely deleted. The workspace had no infrastructure resources under management, so the deletion was allowed to proceed.
All workspace configurations, variables, and settings have been removed. The operation completed successfully.
```
## Workspace Locking and Unlocking
```
User: Lock my "production" workspace while we do maintenance
Claude: I'll lock the "production" workspace to prevent any changes during your maintenance period.
[Claude uses lock_workspace tool]
**Workspace Locked**
I've successfully locked the "production" workspace in "example-org". While locked:
- No new runs can be queued
- Existing runs in progress will continue until completion
- The workspace will remain locked until explicitly unlocked
**Lock Details:**
- Workspace: production
- Locked by: Your account ([email protected])
- Reason: Scheduled maintenance in progress
- Locked at: [Current timestamp]
The workspace will remain locked until you request it to be unlocked.
User: The maintenance is complete, unlock "production" now
Claude: I'll unlock the "production" workspace now that maintenance is complete.
[Claude uses unlock_workspace tool]
**Workspace Unlocked**
The "production" workspace in "example-org" has been successfully unlocked. Normal operations can now resume:
- New runs can be queued
- Automatic runs will trigger as configured
- VCS-triggered runs will process normally
The workspace is now available for use.
User: Someone left "staging" locked and they're out today. Can you force unlock it?
Claude: I'll force unlock the "staging" workspace since the person who locked it is unavailable today.
[Claude uses force_unlock_workspace tool]
**Workspace Force Unlocked**
I've successfully force unlocked the "staging" workspace in "example-org". The workspace was previously locked by [email protected] with reason "Testing configuration changes".
The force unlock has:
- Removed the lock regardless of who placed it
- Made the workspace available for immediate use
- Allowed new runs to be queued
The workspace is now unlocked and available for normal operations. Note that force unlocking should only be done in situations like this where the original locker is unavailable.
```
```
--------------------------------------------------------------------------------
/terraform_cloud_mcp/models/organizations.py:
--------------------------------------------------------------------------------
```python
"""Organization models for Terraform Cloud API
This module contains models for Terraform Cloud organization-related requests.
Reference: https://developer.hashicorp.com/terraform/cloud-docs/api-docs/organizations
"""
from typing import Optional, Union
from pydantic import Field
from .base import APIRequest, CollaboratorAuthPolicy, ExecutionMode
class OrganizationDetailsRequest(APIRequest):
    """Request model for getting organization details.
    This model is used for the GET /organizations/{name} endpoint. The endpoint
    returns detailed information about an organization including its name,
    external ID, created date, and all organization-level settings.
    Reference: https://developer.hashicorp.com/terraform/cloud-docs/api-docs/organizations#show-an-organization
    See:
        docs/models/organization.md for reference
    """
    organization: str = Field(
        ...,
        # No alias needed as field name matches API field name
        description="The name of the organization to retrieve details for",
        min_length=3,
        pattern=r"^[a-z0-9][-a-z0-9_]*[a-z0-9]$",
    )
class OrganizationEntitlementsRequest(APIRequest):
    """Request model for getting organization entitlements.
    This model is used for the GET /organizations/{name}/entitlement-set endpoint.
    The endpoint returns information about which features and capabilities are
    available to the organization based on its subscription tier.
    Reference: https://developer.hashicorp.com/terraform/cloud-docs/api-docs/organizations#show-the-entitlement-set
    See:
        docs/models/organization.md for reference
    """
    organization: str = Field(
        ...,
        # No alias needed as field name matches API field name
        description="The name of the organization to retrieve entitlements for",
        min_length=3,
        pattern=r"^[a-z0-9][-a-z0-9_]*[a-z0-9]$",
    )
class OrganizationDeleteRequest(APIRequest):
    """Request model for deleting an organization.
    This model is used for the DELETE /organizations/{name} endpoint.
    Deleting an organization is a permanent action and cannot be undone.
    All workspaces, configurations, and associated resources will be deleted.
    Reference: https://developer.hashicorp.com/terraform/cloud-docs/api-docs/organizations#delete-an-organization
    Warning:
        This is a destructive operation that cannot be undone. Organization names
        are globally unique and cannot be recreated with the same name later.
    See:
        docs/models/organization.md for reference
    """
    organization: str = Field(
        ...,
        # No alias needed as field name matches API field name
        description="The name of the organization to delete",
        min_length=3,
        pattern=r"^[a-z0-9][-a-z0-9_]*[a-z0-9]$",
    )
class OrganizationListRequest(APIRequest):
    """Request parameters for listing organizations.
    These parameters map to the query parameters in the organizations API.
    The endpoint returns a paginated list of organizations that the authenticated
    user has access to, along with their details.
    Reference: https://developer.hashicorp.com/terraform/cloud-docs/api-docs/organizations#list-organizations
    See:
        docs/models/organization.md for reference
    """
    page_number: Optional[int] = Field(1, ge=1, description="Page number to fetch")
    page_size: Optional[int] = Field(
        20, ge=1, le=100, description="Number of results per page"
    )
    q: Optional[str] = Field(
        None, description="Search query for name and email", max_length=100
    )
    query_email: Optional[str] = Field(
        None, description="Search query for email", max_length=100
    )
    query_name: Optional[str] = Field(
        None, description="Search query for name", max_length=100
    )
class BaseOrganizationRequest(APIRequest):
    """Base class for organization create and update requests with common fields.
    This includes all fields that are commonly used in request payloads for the organization
    creation and update APIs.
    Reference: https://developer.hashicorp.com/terraform/cloud-docs/api-docs/organizations
    Note:
        This class inherits model_config from APIRequest -> BaseModelConfig
    See:
        docs/models/organization.md for fields and usage examples
    """
    # Fields common to both create and update requests with API defaults from docs
    name: Optional[str] = Field(
        None,
        # No alias needed as field name matches API field name
        description="Name of the organization",
        min_length=3,
        pattern=r"^[a-z0-9][-a-z0-9_]*[a-z0-9]$",
    )
    email: Optional[str] = Field(
        None,
        # No alias needed as field name matches API field name
        description="Admin email address",
        pattern=r"^[a-zA-Z0-9._%+-]+@[a-zA-Z0-9.-]+\.[a-zA-Z]{2,}$",
    )
    session_timeout: Optional[int] = Field(
        20160,
        alias="session-timeout",
        description="Session timeout after inactivity in minutes",
        ge=1,
        le=43200,  # 30 days in minutes
    )
    session_remember: Optional[int] = Field(
        20160,
        alias="session-remember",
        description="Session expiration in minutes",
        ge=1,
        le=43200,  # 30 days in minutes
    )
    collaborator_auth_policy: Optional[Union[str, CollaboratorAuthPolicy]] = Field(
        CollaboratorAuthPolicy.PASSWORD,
        alias="collaborator-auth-policy",
        description="Authentication policy",
    )
    cost_estimation_enabled: Optional[bool] = Field(
        False,
        alias="cost-estimation-enabled",
        description="Whether cost estimation is enabled for all workspaces",
    )
    send_passing_statuses_for_untriggered_speculative_plans: Optional[bool] = Field(
        False,
        alias="send-passing-statuses-for-untriggered-speculative-plans",
        description="Whether to send VCS status updates for untriggered plans",
    )
    aggregated_commit_status_enabled: Optional[bool] = Field(
        True,
        alias="aggregated-commit-status-enabled",
        description="Whether to aggregate VCS status updates",
    )
    speculative_plan_management_enabled: Optional[bool] = Field(
        True,
        alias="speculative-plan-management-enabled",
        description="Whether to enable automatic cancellation of plan-only runs",
    )
    owners_team_saml_role_id: Optional[str] = Field(
        None,
        alias="owners-team-saml-role-id",
        description="SAML only - the name of the 'owners' team",
    )
    assessments_enforced: Optional[bool] = Field(
        False,
        alias="assessments-enforced",
        description="Whether to compel health assessments for all eligible workspaces",
    )
    allow_force_delete_workspaces: Optional[bool] = Field(
        False,
        alias="allow-force-delete-workspaces",
        description="Whether workspace admins can delete workspaces with resources",
    )
    default_execution_mode: Optional[Union[str, ExecutionMode]] = Field(
        ExecutionMode.REMOTE,
        alias="default-execution-mode",
        description="Default execution mode",
    )
    default_agent_pool_id: Optional[str] = Field(
        None,
        alias="default-agent-pool-id",
        description="The ID of the agent pool (required when default_execution_mode is 'agent')",
    )
class OrganizationCreateRequest(BaseOrganizationRequest):
    """Request model for creating a Terraform Cloud organization.
    Validates and structures the request according to the Terraform Cloud API
    requirements for creating organizations.
    Reference: https://developer.hashicorp.com/terraform/cloud-docs/api-docs/organizations#create-an-organization
    Note:
        This inherits all configuration fields from BaseOrganizationRequest
        while making name and email required.
    See:
        docs/models/organization.md for reference
    """
    # Override name and email to make them required for creation
    name: str = Field(..., description="Name of the organization")
    email: str = Field(..., description="Admin email address")
class OrganizationUpdateRequest(BaseOrganizationRequest):
    """Request model for updating a Terraform Cloud organization.
    Validates and structures the request according to the Terraform Cloud API
    requirements for updating organizations. All fields are optional.
    Reference: https://developer.hashicorp.com/terraform/cloud-docs/api-docs/organizations#update-an-organization
    Note:
        This inherits all configuration fields from BaseOrganizationRequest
        and adds a required organization field for routing.
    See:
        docs/models/organization.md for reference
    """
    # Add organization field which is required for updates but not part of the attributes
    organization: str = Field(
        ...,
        # No alias needed as field name matches API field name
        description="The name of the organization to update",
    )
class OrganizationParams(BaseOrganizationRequest):
    """Parameters for organization operations without routing fields.
    This model provides all optional parameters that can be used when creating or updating
    organizations, reusing the field definitions from BaseOrganizationRequest.
    Reference: https://developer.hashicorp.com/terraform/cloud-docs/api-docs/organizations
    Note:
        All fields are inherited from BaseOrganizationRequest.
    See:
        docs/models/organization.md for reference
    """
    # Inherits model_config and all fields from BaseOrganizationRequest
# Response handling is implemented through raw dictionaries
```
--------------------------------------------------------------------------------
/docs/models/run.md:
--------------------------------------------------------------------------------
```markdown
# Run Models
This document describes the data models used for run operations in Terraform Cloud.
## Overview
Run models provide structure and validation for initiating and managing runs in Terraform Cloud. These models define run configuration options, variables, targeting, and behavior settings. Runs represent the execution of Terraform plans and applies, serving as the primary method for making infrastructure changes.
## Models Reference
### RunVariable
**Type:** Object
**Description:** Defines a variable to be included with a specific run. Run variables override workspace variables for the duration of the run.
**Fields:**
- `key` (string, required): The variable name (max length: 128)
- `value` (string, required): The variable value (max length: 256)
**JSON representation:**
```json
{
  "key": "environment",
  "value": "production"
}
```
**Usage Context:**
Run variables provide a way to override workspace variables temporarily for a specific run without changing the workspace configuration. This is useful for testing changes with different variable values or handling one-off scenarios.
### RunOperation
**Type:** Enum (string)
**Description:** Describes the type of operation being performed by a run.
**Values:**
- `plan`: Standard planning operation
- `plan_destroy`: Planning a destroy operation
- `apply`: Standard apply operation
- `destroy`: Destroying resources operation
**Usage Context:**
Used for filtering runs and determining the type of operation being performed.
### RunSource
**Type:** Enum (string)
**Description:** Indicates the origin that initiated the run.
**Values:**
- `api`: Run was created via the API
- `cli`: Run was created using Terraform CLI
- `ui`: Run was created through the web UI
- `vcs`: Run was triggered by a VCS change
- `run_trigger`: Run was triggered by another workspace's changes
**Usage Context:**
Useful for identifying how a run was created and for filtering runs by their origin.
### RunStatusGroup
**Type:** Enum (string)
**Description:** Categorizes runs into logical groupings based on their current state.
**Values:**
- `pending`: Runs that haven't completed their execution (includes planning, applying)
- `completed`: Runs that have finished execution successfully (planned, applied, etc.)
- `failed`: Runs that have encountered errors or been canceled
- `discarded`: Runs whose changes were not applied
**Usage Context:**
Provides a higher-level categorization of run statuses for filtering and reporting.
### RunParams
**Type:** Object
**Description:** Parameters for run operations. Used to specify configuration options when creating a run.
**Fields:**
- `message` (string, optional): Description of the run's purpose
- `is_destroy` (boolean, optional): Whether to destroy all resources (default: false)
- `auto_apply` (boolean, optional): Whether to auto-apply after successful plan
- `refresh` (boolean, optional): Whether to refresh state before planning (default: true)
- `refresh_only` (boolean, optional): Only refresh state without planning changes (default: false)
- `plan_only` (boolean, optional): Create speculative plan without applying (default: false)
- `allow_empty_apply` (boolean, optional): Allow applies with no changes (default: false)
- `target_addrs` (array, optional): List of resource addresses to target
- `replace_addrs` (array, optional): List of resource addresses to force replacement
- `variables` (array, optional): Run-specific variables that override workspace variables
- `terraform_version` (string, optional): Specific Terraform version for this run
- `save_plan` (boolean, optional): Save the plan for later execution (default: false)
- `debugging_mode` (boolean, optional): Enable extended debug logging (default: false)
- `allow_config_generation` (boolean, optional): Allow generating config for imports (default: false)
**JSON representation:**
```json
{
  "data": {
    "type": "runs",
    "attributes": {
      "message": "Deploy infrastructure changes",
      "plan-only": false,
      "is-destroy": false,
      "refresh": true,
      "auto-apply": false,
      "target-addrs": ["module.network", "aws_instance.web_servers"],
      "replace-addrs": ["module.database.aws_db_instance.main"],
      "variables": [
        {"key": "environment", "value": "production"},
        {"key": "region", "value": "us-west-2"}
      ]
    },
    "relationships": {
      "workspace": {
        "data": {
          "id": "ws-example123",
          "type": "workspaces"
        }
      }
    }
  }
}
```
**Validation Rules:**
- When using `plan_only`, the run will not proceed to apply phase
- If `is_destroy` is true, the run will create a plan to destroy all resources
- `terraform_version` can only be specified for plan-only runs
- `replace_addrs` and `target_addrs` must be valid resource addresses
### RunStatus
**Type:** Enum (string)
**Description:** Represents the possible states of a run during its lifecycle.
**Values:**
- `pending`: Not yet started
- `planning`: Currently running the plan phase
- `planned`: Plan phase completed successfully
- `cost_estimating`: Cost estimate in progress
- `cost_estimated`: Cost estimate completed
- `policy_checking`: Policy check in progress
- `policy_override`: Policy check soft-failed, requires override
- `policy_checked`: Policy check passed
- `confirmed`: Plan confirmed for apply
- `planned_and_finished`: Plan completed, no changes required
- `applying`: Currently applying changes
- `applied`: Apply phase completed successfully
- `discarded`: Discarded and not applied
- `errored`: Encountered an error during the run
- `canceled`: Was canceled during execution
- `force_canceled`: Was forcibly canceled during execution
**Usage Context:**
Used to determine the current phase and state of a run and what actions can be taken on it. For example:
- Only runs in `planned` status can be applied
- Only runs in `planning`, `applying`, or other active states can be canceled
### RunCreateRequest
**Type:** Request Validation Model
**Description:** Used to validate parameters for creating a run.
**Fields:**
- `workspace_id` (string, required): The ID of the workspace to create a run in
- `params` (RunParams, optional): Configuration options for the run
**Used by:**
- `create_run` tool function to validate run creation parameters
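A hedged sketch of creating a run with these parameters; the `RunParams` model and `create_run` tool live in `terraform_cloud_mcp/models/runs.py` and `terraform_cloud_mcp/tools/runs.py` (not shown on this page), so the exact signatures are assumed to follow the documented fields above:
```python
import asyncio

from terraform_cloud_mcp.models.runs import RunParams
from terraform_cloud_mcp.tools.runs import create_run

async def main() -> None:
    # "ws-example123" is a placeholder workspace ID
    params = RunParams(
        message="Deploy infrastructure changes",
        target_addrs=["module.network"],
        variables=[{"key": "environment", "value": "production"}],
    )
    run = await create_run(workspace_id="ws-example123", params=params)
    print(run.get("data", {}).get("id"))

asyncio.run(main())
```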
### RunDetailsRequest
**Type:** Request Validation Model
**Description:** Used to validate parameters for retrieving run details.
**Fields:**
- `run_id` (string, required): The ID of the run to retrieve details for
  - Format: Must match pattern "run-[a-zA-Z0-9]{16}"
  - Example: "run-AbCdEfGhIjKlMnOp"
**Validation Rules:**
- Run ID must start with "run-" prefix
- Must contain exactly 16 alphanumeric characters after the prefix
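For reference, the documented format corresponds to a regular expression like the one below (a standalone illustration, not code taken from the models module):
```python
import re

# "run-" prefix followed by exactly 16 alphanumeric characters
RUN_ID_RE = re.compile(r"run-[a-zA-Z0-9]{16}")

assert RUN_ID_RE.fullmatch("run-AbCdEfGhIjKlMnOp") is not None
assert RUN_ID_RE.fullmatch("run-short") is None
```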
### RunActionRequest
**Type:** Request Validation Model
**Description:** Used to validate parameters for performing actions on a run.
**Fields:**
- `run_id` (string, required): The ID of the run to perform an action on
- `comment` (string, optional): An optional explanation for the action
**Used by:**
- `apply_run`, `discard_run`, `cancel_run`, `force_cancel_run` tool functions
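A hedged sketch of confirming a planned run; the `apply_run` signature is assumed from the fields above, since `terraform_cloud_mcp/tools/runs.py` is not shown on this page:
```python
import asyncio

from terraform_cloud_mcp.tools.runs import apply_run

async def main() -> None:
    # Placeholder run ID; the comment is optional
    result = await apply_run(
        run_id="run-AbCdEfGhIjKlMnOp",
        comment="Approved after review",
    )
    print(result)

asyncio.run(main())
```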
### RunListRequest
**Type:** Request Validation Model
**Description:** Used to validate parameters for listing runs in a workspace.
**Fields:**
- `workspace_id` (string, required): The workspace ID to list runs for
- `page_number` (int, optional): Page number to fetch (default: 1)
- `page_size` (int, optional): Number of results per page (default: 20)
- Filter fields for more specific results:
  - `filter_operation`, `filter_status`, `filter_source`, `filter_status_group`
**Used by:**
- `list_runs_in_workspace` tool function to validate listing parameters
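A hedged listing sketch using the filter fields above; parameter names are assumed to match the documented fields, and `filter_status` takes values from the `RunStatus` enum:
```python
import asyncio

from terraform_cloud_mcp.tools.runs import list_runs_in_workspace

async def main() -> None:
    # "ws-example123" is a placeholder workspace ID
    result = await list_runs_in_workspace(
        workspace_id="ws-example123",
        page_size=10,
        filter_status="applied",
    )
    for run in result.get("data", []):
        print(run["id"], run["attributes"]["status"])

asyncio.run(main())
```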
## API Response Structure
### Run Details Response
```json
{
  "data": {
    "id": "run-CKJhKBXrxZuc9WLS",
    "type": "runs",
    "attributes": {
      "status": "planned",
      "message": "Deploy infrastructure changes",
      "has-changes": true,
      "created-at": "2023-05-05T15:45:22Z",
      "is-destroy": false,
      "source": "tfe-cli",
      "status-timestamps": {
        "planned-at": "2023-05-05T15:46:02Z"
      }
    },
    "relationships": {
      "workspace": {
        "data": {
          "id": "ws-SihZTyXKfNXUWuUa",
          "type": "workspaces"
        }
      },
      "plan": {
        "data": {
          "id": "plan-j9yycwFSJ2Mn8wuC",
          "type": "plans"
        }
      },
      "cost-estimate": {
        "data": {
          "id": "ce-BPvFFrYCqRV6qVBK",
          "type": "cost-estimates"
        }
      }
    },
    "links": {
      "self": "/api/v2/runs/run-CKJhKBXrxZuc9WLS"
    }
  }
}
```
### List Runs Response
```json
{
  "data": [
    {
      "id": "run-CKJhKBXrxZuc9WLS",
      "type": "runs",
      "attributes": {
        "status": "applied",
        "message": "Weekly infrastructure update",
        "has-changes": true,
        "created-at": "2023-05-05T15:45:22Z",
        "is-destroy": false,
        "source": "tfe-cli"
      }
    },
    {
      "id": "run-DKJhKBXrxZuc9WLA",
      "type": "runs",
      "attributes": {
        "status": "planned",
        "message": "Update database configuration",
        "has-changes": true,
        "created-at": "2023-05-06T09:12:34Z",
        "is-destroy": false,
        "source": "api"
      }
    }
  ],
  "meta": {
    "pagination": {
      "current-page": 1,
      "page-size": 20,
      "prev-page": null,
      "next-page": null,
      "total-pages": 1,
      "total-count": 2
    }
  }
}
```
## Related Resources
- [Run Tools](../tools/run.md)
- [Plan Models](plan.md)
- [Apply Models](apply.md)
- [Cost Estimate Models](cost_estimate.md)
- [Terraform Cloud API - Runs](https://developer.hashicorp.com/terraform/cloud-docs/api-docs/run)
```
--------------------------------------------------------------------------------
/docs/conversations/variables-conversation.md:
--------------------------------------------------------------------------------
```markdown
# Variables Management - Example Conversations
This document shows example conversations with Claude using the Terraform Cloud MCP variable management features.
## Setting Up Workspace Variables
```
User: I need to set up variables for my new web-app workspace. Can you first show me what variables already exist, then help me add some basic ones?
Claude: I'll check the existing variables in your web-app workspace and then help you add the ones you need.
[Claude uses list_workspace_variables tool]
I can see your workspace currently has no variables configured. Let me help you set up some common variables for a web application.
First, I'll add a Terraform variable for the environment:
[Claude uses create_workspace_variable tool]
I've created a Terraform variable called "environment" with the value "development" to help with resource naming.
Now let me add an environment variable for the application port:
[Claude uses create_workspace_variable tool]
I've added an environment variable "APP_PORT" set to "3000" for your application.
Finally, let me add a sensitive variable for database credentials:
[Claude uses create_workspace_variable tool]
I've created a sensitive environment variable "DB_PASSWORD" for your database connection. The value is hidden for security.
Your workspace now has three variables configured:
- **environment** (terraform): "development" - Used for resource naming
- **APP_PORT** (env): "3000" - Application port number  
- **DB_PASSWORD** (env): [hidden] - Database password (sensitive)
User: I need to update the environment variable from "development" to "staging" and make the database password more secure.
Claude: I'll help you update both of those variables. Let me modify the environment variable first, then update the database password.
[Claude uses update_workspace_variable tool]
I've updated the environment variable from "development" to "staging" and updated the description to reflect the change.
[Claude uses update_workspace_variable tool]
I've also updated the database password with a more secure value while keeping it marked as sensitive.
Both variables have been successfully updated:
- **environment**: Now set to "staging"
- **DB_PASSWORD**: Updated with a stronger password (value remains hidden)
```
## Variable Sets Management
```
User: I want to create a variable set for AWS configuration that I can share across multiple workspaces. It should include region, instance type, and some common tags.
Claude: I'll help you create an AWS configuration variable set that can be shared across your workspaces. This will ensure consistent configuration across your infrastructure.
[Claude uses create_variable_set tool]
I've created a variable set called "aws-common-config" with priority enabled, which means these variables will override any workspace-level variables with the same names.
Now let me add the AWS configuration variables to this set:
[Claude uses create_variable_in_variable_set tool]
I've added the AWS region variable to the set, with the value "us-west-2".
[Claude uses create_variable_in_variable_set tool]
I've added the default instance type as "t3.micro".
[Claude uses create_variable_in_variable_set tool]
I've added common tags as an HCL variable that includes Environment, ManagedBy, and Team tags.
Your AWS configuration variable set is now ready with:
- **aws_region**: "us-west-2"
- **instance_type**: "t3.micro"  
- **common_tags**: HCL object with Environment, ManagedBy, and Team tags
Since this variable set has priority enabled, these variables will take precedence over any workspace-level variables with the same names.
User: I want to assign the AWS variable set to my web-app and api-service workspaces, and also assign it to my entire "production" project.
Claude: I'll assign the AWS configuration variable set to both your specific workspaces and the entire production project. This will ensure consistent AWS configuration across your infrastructure.
[Claude uses assign_variable_set_to_workspaces tool]
I've assigned the AWS variable set to your web-app and api-service workspaces.
[Claude uses assign_variable_set_to_projects tool]
I've also assigned the variable set to your entire production project.
The AWS configuration variable set is now active in:
- **web-app** and **api-service** workspaces directly
- All workspaces within the **production** project
Any new workspaces added to the production project will automatically inherit these AWS variables.
```
## Managing Sensitive Variables
```
User: I need to add some API keys and database credentials to my workspace. How should I handle these sensitive values?
Claude: For sensitive data like API keys and credentials, I'll make sure to mark them as sensitive so their values are hidden in the UI and API responses. This is crucial for security.
Let me add these sensitive variables for you:
[Claude uses create_workspace_variable tool]
I've added your Stripe API secret key as a sensitive environment variable. The value is now hidden from the UI and logs.
[Claude uses create_workspace_variable tool]
I've added the database connection string as a sensitive environment variable as well.
[Claude uses create_workspace_variable tool]
I've also added your OAuth client secret as a sensitive environment variable.
All three sensitive variables have been created:
- **STRIPE_SECRET_KEY** (env, sensitive): Your payment processing key
- **DATABASE_URL** (env, sensitive): Database connection string  
- **OAUTH_CLIENT_SECRET** (env, sensitive): OAuth application secret
These values are now secured and won't be visible in the Terraform Cloud UI or appear in logs. They'll only be available to your Terraform runs as environment variables.
User: How can I see what variables are available across all my variable sets?
Claude: I'll help you get an overview of your variable sets and their contents. Let me start by listing all your variable sets.
[Claude uses list_variable_sets tool]
You have several variable sets configured:
1. **aws-common-config** - Common AWS configuration (priority enabled)
2. **production-secrets** - Production environment secrets (priority enabled)  
3. **development-config** - Development environment variables (global)
Let me show you what variables are in each set:
[Claude uses list_variables_in_variable_set tool]
**aws-common-config** contains:
- aws_region: "us-west-2"
- instance_type: "t3.micro"
- common_tags: HCL object with standard tags
[Claude uses list_variables_in_variable_set tool]
**production-secrets** contains:
- DATABASE_URL: [sensitive] - Production database connection
- API_KEY: [sensitive] - Production API key
- REDIS_URL: [sensitive] - Redis connection string
[Claude uses list_variables_in_variable_set tool]
**development-config** contains:
- DEBUG_MODE: "true"
- LOG_LEVEL: "debug"
- MOCK_SERVICES: "true"
This gives you a complete view of your variable organization across different environments and purposes.
```
## Variable Set Organization Strategy
```
User: What's the best way to organize variables for development, staging, and production environments?
Claude: Let me show you an effective variable organization strategy and help you see how your current setup compares to best practices.
[Claude uses list_variable_sets tool]
Looking at your current variable sets, I can recommend an organization strategy that follows best practices:
**Environment-specific variable sets:**
- Create separate variable sets for each environment (dev, staging, prod)
- Use `global=false` and assign to specific projects/workspaces
- Include environment-specific values like database URLs, API endpoints
**Common configuration variable sets:**
- Shared provider configuration (like your AWS set)
- Common tags and naming conventions  
- Use `priority=true` to ensure consistency
**Application-specific variables:**
- Keep at workspace level for unique configurations
- Use for application-specific secrets and settings
Based on your current setup, you're already following good practices with your "aws-common-config" set. Here's what I'd recommend:
1. **Keep your AWS common config** - It's well structured with priority enabled
2. **Create environment-specific sets** - Separate sets for dev/staging/prod environments
3. **Use project assignments** - Assign environment sets to corresponding projects
4. **Workspace-level for unique values** - Keep application-specific variables at workspace level
This approach provides:
- Clear separation of concerns
- Easy environment promotion  
- Consistent infrastructure configuration
- Secure handling of sensitive data
Would you like me to help you implement any of these organizational improvements?
```
## Variable Cleanup and Maintenance
```
User: I need to clean up some old variables from my AWS variable set and remove a deprecated variable set entirely.
Claude: I'll help you clean up those unused variables and remove the deprecated variable set. Since these are delete operations, they'll permanently remove the data.
Let me start by removing the unused variable from your AWS variable set:
[Claude uses delete_variable_from_variable_set tool]
I've removed the old variable from your AWS variable set. The variable is now gone from all workspaces that use this variable set.
Now let me remove the deprecated variable set entirely:
[Claude uses delete_variable_set tool]
The deprecated variable set has been completely removed. This action:
- Deleted all variables within the set
- Unassigned it from all workspaces and projects  
- Cannot be undone
Your variable management is now cleaned up:
- Unused variable removed from AWS configuration set
- Deprecated variable set completely deleted
- All affected workspaces updated automatically
The cleanup is complete and your variable organization is now more streamlined.
```
```
--------------------------------------------------------------------------------
/docs/models/workspace.md:
--------------------------------------------------------------------------------
```markdown
# Workspace Models
This document describes the data models used for workspace operations in Terraform Cloud.
## Overview
Workspace models provide structure and validation for interacting with the Terraform Cloud Workspaces API. These models define workspace configuration options, VCS repository settings, and execution modes. Workspaces serve as the primary organizational unit in Terraform Cloud, containing Terraform configurations, state files, variables, and run history for managing specific infrastructure resources.
## Models Reference
### ExecutionMode
**Type:** Enum (string)
**Description:** Defines how Terraform operations are executed.
**Values:**
- `remote`: Terraform runs on Terraform Cloud's infrastructure
- `local`: Terraform runs on your local machine
- `agent`: Terraform runs on your own infrastructure using an agent
**JSON representation:**
```json
{
  "execution-mode": "remote"
}
```
**Usage Context:**
The execution mode determines where Terraform commands are run when triggered through the API, UI, or VCS. Different execution modes offer trade-offs between convenience, security, and integration with existing workflows.
**Notes:**
- Default value is `remote`
- When using `agent` mode, you must also specify an `agent-pool-id`
- The `local` mode is typically used for CLI-driven workflows
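**Example (Python, illustrative):**
A minimal sketch of setting the execution mode through the package's own models (`terraform_cloud_mcp/models/workspaces.py` and the `ExecutionMode` enum in `models/base.py`); the agent pool ID below is a placeholder.
```python
from terraform_cloud_mcp.models.base import ExecutionMode
from terraform_cloud_mcp.models.workspaces import WorkspaceParams

# Remote execution is the default and needs no extra configuration.
remote = WorkspaceParams(execution_mode=ExecutionMode.REMOTE)

# Agent execution must also name an agent pool (placeholder ID).
agent = WorkspaceParams(execution_mode="agent", agent_pool_id="apool-1234abcd")

# Serializing by alias produces the kebab-case keys the API expects, e.g.
# attrs["execution-mode"] == "agent" and attrs["agent-pool-id"] == "apool-1234abcd".
attrs = agent.model_dump(by_alias=True, exclude_none=True, mode="json")
```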
### VcsRepoConfig
**Type:** Object
**Description:** Version control system repository configuration for a workspace.
**Fields:**
- `branch` (string, optional): The repository branch that Terraform executes from
- `identifier` (string, optional): Repository reference in the format `:org/:repo`
- `ingress_submodules` (boolean, optional): Whether to fetch submodules when cloning
- `oauth_token_id` (string, optional): VCS OAuth connection and token ID
- `github_app_installation_id` (string, optional): GitHub App Installation ID
- `tags_regex` (string, optional): Regex pattern to match Git tags
**JSON representation:**
```json
{
  "vcs-repo": {
    "branch": "main",
    "identifier": "my-org/my-repo",
    "ingress-submodules": true,
    "oauth-token-id": "ot-1234abcd"
  }
}
```
**Usage Context:**
Configuring VCS integration allows Terraform Cloud to automatically queue runs when changes are pushed to the linked repository. The workspace will use the Terraform configuration files from the repository at the specified branch.
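**Example (Python, illustrative):**
A short sketch of building this configuration with the `VcsRepoConfig` model from `terraform_cloud_mcp/models/workspaces.py`; the repository identifier and OAuth token ID are placeholders.
```python
from terraform_cloud_mcp.models.workspaces import VcsRepoConfig

vcs = VcsRepoConfig(
    identifier="my-org/my-repo",   # placeholder :org/:repo reference
    branch="main",
    ingress_submodules=True,
    oauth_token_id="ot-1234abcd",  # placeholder OAuth token ID
)

# Dumping by alias yields the kebab-case keys shown in the JSON above.
print(vcs.model_dump(by_alias=True, exclude_none=True))
```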
### WorkspaceParams
**Type:** Object
**Description:** Parameters for workspace operations without routing fields. Used to specify configuration parameters for creating or updating workspaces.
**Fields:**
- `name` (string, optional): Name of the workspace
- `description` (string, optional): Human-readable description
- `execution_mode` (string, optional): How operations are executed (remote, local, agent)
- `agent_pool_id` (string, optional): Agent pool ID (required when execution_mode=agent)
- `assessments_enabled` (boolean, optional): Whether to perform health assessments
- `auto_apply` (boolean, optional): Whether to auto-apply successful plans from VCS/CLI
- `auto_apply_run_trigger` (boolean, optional): Whether to auto-apply changes from run triggers
- `auto_destroy_at` (string, optional): Timestamp when the next destroy run will occur
- `auto_destroy_activity_duration` (string, optional): Auto-destroy duration based on inactivity
- `file_triggers_enabled` (boolean, optional): Whether to filter runs based on file paths
- `working_directory` (string, optional): Directory to execute commands in
- `speculative_enabled` (boolean, optional): Whether to allow speculative plans
- `terraform_version` (string, optional): Version of Terraform to use (default: latest)
- `global_remote_state` (boolean, optional): Allow all workspaces to access this state
- `vcs_repo` (VcsRepoConfig, optional): VCS repository configuration
- `allow_destroy_plan` (boolean, optional): Allow destruction plans (default: true)
- `queue_all_runs` (boolean, optional): Whether runs should be queued immediately
- `source_name` (string, optional): Where workspace settings originated
- `source_url` (string, optional): URL to origin source
- `trigger_prefixes` (array, optional): List of paths that trigger runs
- `trigger_patterns` (array, optional): List of glob patterns that trigger runs
- `setting_overwrites` (object, optional): Attributes with organization-level defaults
**JSON representation:**
```json
{
  "data": {
    "type": "workspaces",
    "attributes": {
      "name": "my-workspace",
      "description": "Example workspace for terraform configurations",
      "terraform-version": "1.5.0",
      "working-directory": "terraform/",
      "auto-apply": true,
      "file-triggers-enabled": true,
      "trigger-patterns": ["**/*.tf", "*.tfvars"],
      "vcs-repo": {
        "identifier": "organization/repo-name",
        "oauth-token-id": "ot-1234abcd",
        "branch": "main"
      }
    }
  }
}
```
**Validation Rules:**
- Workspace names must be unique within an organization
- When using `agent` execution mode, `agent_pool_id` must be provided
- `terraform_version` must be a valid Terraform version string
- `trigger_patterns` and `trigger_prefixes` can only be used when `file_triggers_enabled` is true
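**Example (Python, illustrative):**
A sketch of assembling these parameters with the `WorkspaceParams` model; the values and VCS identifiers are placeholders.
```python
from terraform_cloud_mcp.models.workspaces import VcsRepoConfig, WorkspaceParams

params = WorkspaceParams(
    description="Example workspace for terraform configurations",
    terraform_version="1.5.0",
    working_directory="terraform/",
    auto_apply=True,
    file_triggers_enabled=True,
    trigger_patterns=["**/*.tf", "*.tfvars"],
    vcs_repo=VcsRepoConfig(
        identifier="organization/repo-name",  # placeholder repository
        oauth_token_id="ot-1234abcd",         # placeholder token ID
        branch="main",
    ),
)

# The aliased dump matches the "attributes" object in the JSON example above,
# e.g. attributes["terraform-version"] == "1.5.0".
attributes = params.model_dump(by_alias=True, exclude_none=True, mode="json")
```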
### WorkspaceCreateRequest
**Type:** Request Validation Model
**Description:** Used to validate parameters for creating a workspace.
**Fields:**
- `organization` (string, required): The organization name
- `name` (string, required): The name for the new workspace
- `params` (WorkspaceParams, optional): Additional configuration options
**Used by:**
- `create_workspace` tool function to validate workspace creation parameters
### WorkspaceUpdateRequest
**Type:** Request Validation Model
**Description:** Used to validate parameters for updating a workspace.
**Fields:**
- `organization` (string, required): The organization name
- `workspace_name` (string, required): The name of the workspace to update
- `params` (WorkspaceParams, optional): Settings to update
**Used by:**
- `update_workspace` tool function to validate workspace update parameters
### WorkspaceListRequest
**Type:** Request Validation Model
**Description:** Parameters for listing workspaces in an organization.
**Fields:**
- `organization` (string, required): The name of the organization to list workspaces from
- `page_number` (integer, optional): Page number to fetch (default: 1)
- `page_size` (integer, optional): Number of results per page (default: 20, max: 100)
- `search` (string, optional): Substring to search for in workspace names
**Usage Context:**
Used to retrieve a paginated list of workspaces in an organization, with optional filtering by name search.
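**Example (Python, illustrative):**
A small sketch of the validation this model performs; page size is constrained to the 1-100 range, so out-of-range values raise a Pydantic `ValidationError`.
```python
from pydantic import ValidationError

from terraform_cloud_mcp.models.workspaces import WorkspaceListRequest

request = WorkspaceListRequest(organization="my-org", page_size=50, search="prod")

try:
    WorkspaceListRequest(organization="my-org", page_size=500)
except ValidationError:
    # page_size=500 exceeds the maximum of 100 and is rejected.
    pass
```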
### WorkspaceLockRequest
**Type:** Request Validation Model
**Description:** Used to validate parameters for locking a workspace.
**Fields:**
- `workspace_id` (string, required): The ID of the workspace to lock
- `reason` (string, optional): Explanation for locking the workspace
**Used by:**
- `lock_workspace` tool function to validate workspace locking parameters
### DataRetentionPolicyRequest
**Type:** Request Validation Model
**Description:** Parameters for setting a data retention policy on a workspace.
**Fields:**
- `workspace_id` (string, required): The ID of the workspace to set the policy for
- `days` (integer, required): Number of days to retain data
**JSON representation:**
```json
{
  "data": {
    "type": "workspace-settings",
    "attributes": {
      "data-retention-days": 30
    }
  }
}
```
**Usage Context:**
Data retention policies control how long Terraform Cloud retains state versions and run history for a workspace, helping manage storage usage and meet data retention requirements.
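**Example (Python, illustrative):**
A minimal sketch of constructing this request; the workspace ID is a placeholder and the payload mirrors the JSON document above.
```python
from terraform_cloud_mcp.models.workspaces import DataRetentionPolicyRequest

request = DataRetentionPolicyRequest(workspace_id="ws-SihZTyXKfNXUWuUa", days=30)

payload = {
    "data": {
        "type": "workspace-settings",
        "attributes": {"data-retention-days": request.days},
    }
}
```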
## API Response Structure
### Workspace Details Response
```json
{
  "data": {
    "id": "ws-SihZTyXKfNXUWuUa",
    "type": "workspaces",
    "attributes": {
      "name": "my-workspace",
      "description": "Example workspace for terraform configurations",
      "auto-apply": true,
      "created-at": "2023-05-01T12:34:56Z",
      "environment": "default",
      "execution-mode": "remote",
      "file-triggers-enabled": true,
      "global-remote-state": false,
      "locked": false,
      "terraform-version": "1.5.0",
      "trigger-patterns": ["**/*.tf", "*.tfvars"],
      "working-directory": "terraform/"
    },
    "relationships": {
      "organization": {
        "data": {
          "id": "org-ABcd1234",
          "type": "organizations"
        }
      },
      "current-run": {
        "data": null
      },
      "latest-run": {
        "data": {
          "id": "run-CKJhKBXrxZuc9WLS",
          "type": "runs"
        }
      }
    },
    "links": {
      "self": "/api/v2/organizations/my-org/workspaces/my-workspace"
    }
  }
}
```
### List Workspaces Response
```json
{
  "data": [
    {
      "id": "ws-SihZTyXKfNXUWuUa",
      "type": "workspaces",
      "attributes": {
        "name": "production",
        "description": "Production infrastructure",
        "terraform-version": "1.5.0",
        "created-at": "2023-05-01T12:34:56Z",
        "execution-mode": "remote"
      },
      "relationships": {
        "organization": {
          "data": {
            "id": "org-ABcd1234",
            "type": "organizations"
          }
        },
        "current-state-version": {
          "data": {
            "id": "sv-12345abcde",
            "type": "state-versions"
          }
        }
      }
    },
    {
      "id": "ws-TjqZUyYLfOXWhwbC",
      "type": "workspaces",
      "attributes": {
        "name": "staging",
        "description": "Staging infrastructure",
        "terraform-version": "1.5.0",
        "created-at": "2023-05-02T09:12:34Z",
        "execution-mode": "remote"
      },
      "relationships": {
        "organization": {
          "data": {
            "id": "org-ABcd1234",
            "type": "organizations"
          }
        }
      }
    }
  ],
  "meta": {
    "pagination": {
      "current-page": 1,
      "page-size": 20,
      "prev-page": null,
      "next-page": null,
      "total-pages": 1,
      "total-count": 2
    }
  }
}
```
## Related Resources
- [Workspace Tools](../tools/workspace.md)
- [Organization Models](organization.md)
- [Run Models](run.md)
- [Project Models](project.md)
- [Terraform Cloud API - Workspaces](https://developer.hashicorp.com/terraform/cloud-docs/api-docs/workspaces)
```
--------------------------------------------------------------------------------
/terraform_cloud_mcp/models/workspaces.py:
--------------------------------------------------------------------------------
```python
"""Workspace models for Terraform Cloud API
This module contains models for Terraform Cloud workspace-related requests.
Reference: https://developer.hashicorp.com/terraform/cloud-docs/api-docs/workspaces
"""
from typing import Dict, List, Optional, Union
from pydantic import Field
from .base import APIRequest, ExecutionMode
class VcsRepoConfig(APIRequest):
    """VCS repository configuration for a workspace.
    Defines version control system repository configuration for a workspace,
    including branch, repository identifier, OAuth token, and other settings.
    Reference: https://developer.hashicorp.com/terraform/cloud-docs/api-docs/workspaces
    See:
        docs/models/workspace.md for reference
    """
    # Inherits model_config from APIRequest -> BaseModelConfig
    branch: Optional[str] = Field(
        None, description="The repository branch that Terraform executes from"
    )
    identifier: Optional[str] = Field(
        None, description="A reference to your VCS repository in the format :org/:repo"
    )
    ingress_submodules: Optional[bool] = Field(
        None,
        alias="ingress-submodules",
        description="Whether submodules should be fetched when cloning the VCS repository",
    )
    oauth_token_id: Optional[str] = Field(
        None,
        alias="oauth-token-id",
        description="Specifies the VCS OAuth connection and token",
    )
    github_app_installation_id: Optional[str] = Field(
        None,
        alias="github-app-installation-id",
        description="The VCS Connection GitHub App Installation to use",
    )
    tags_regex: Optional[str] = Field(
        None,
        alias="tags-regex",
        description="A regular expression used to match Git tags",
    )
class WorkspaceListRequest(APIRequest):
    """Request parameters for listing workspaces in an organization.
    Defines the parameters for the workspace listing API including pagination
    and search filtering options.
    Reference: https://developer.hashicorp.com/terraform/cloud-docs/api-docs/workspaces#list-workspaces
    See:
        docs/models/workspace.md for reference
    """
    organization: str = Field(
        ...,
        # No alias needed as field name matches API field name
        description="The name of the organization to list workspaces from",
    )
    page_number: Optional[int] = Field(1, ge=1, description="Page number to fetch")
    page_size: Optional[int] = Field(
        20, ge=1, le=100, description="Number of results per page"
    )
    search: Optional[str] = Field(None, description="Substring to search for")
class BaseWorkspaceRequest(APIRequest):
    """Base class for workspace create and update requests with common fields.
    This includes common fields used in request payloads for workspace
    creation and update APIs, providing a foundation for more specific workspace models.
    Reference: https://developer.hashicorp.com/terraform/cloud-docs/api-docs/workspaces
    Note:
        This class inherits model_config from APIRequest -> BaseModelConfig and provides
        default values for most fields based on Terraform Cloud API defaults.
    See:
        docs/models/workspace.md for detailed field descriptions and usage examples
    """
    # Fields common to both create and update requests with API defaults from docs
    name: Optional[str] = Field(
        None,
        # No alias needed as field name matches API field name
        description="Name of the workspace",
    )
    description: Optional[str] = Field(
        None,
        # No alias needed as field name matches API field name
        description="Description of the workspace",
    )
    execution_mode: Optional[Union[str, ExecutionMode]] = Field(
        ExecutionMode.REMOTE,
        alias="execution-mode",
        description="How operations are executed",
    )
    agent_pool_id: Optional[str] = Field(
        None, alias="agent-pool-id", description="The ID of the agent pool"
    )
    assessments_enabled: Optional[bool] = Field(
        False,
        alias="assessments-enabled",
        description="Whether to perform health assessments",
    )
    auto_apply: Optional[bool] = Field(
        False,
        alias="auto-apply",
        description="Whether to automatically apply changes in runs triggered by VCS, UI, or CLI",
    )
    auto_apply_run_trigger: Optional[bool] = Field(
        False,
        alias="auto-apply-run-trigger",
        description="Whether to automatically apply changes initiated by run triggers",
    )
    auto_destroy_at: Optional[str] = Field(
        None,
        alias="auto-destroy-at",
        description="Timestamp when the next scheduled destroy run will occur",
    )
    auto_destroy_activity_duration: Optional[str] = Field(
        None,
        alias="auto-destroy-activity-duration",
        description="Value and units for automatically scheduled destroy runs based on workspace activity",
    )
    file_triggers_enabled: Optional[bool] = Field(
        True,
        alias="file-triggers-enabled",
        description="Whether to filter runs based on file paths",
    )
    working_directory: Optional[str] = Field(
        None,
        alias="working-directory",
        description="The directory to execute commands in",
    )
    speculative_enabled: Optional[bool] = Field(
        True,
        alias="speculative-enabled",
        description="Whether this workspace allows speculative plans",
    )
    terraform_version: Optional[str] = Field(
        "latest",
        alias="terraform-version",
        description="Specifies the version of Terraform to use for this workspace",
    )
    global_remote_state: Optional[bool] = Field(
        False,
        alias="global-remote-state",
        description="Whether to allow all workspaces to access this workspace's state",
    )
    vcs_repo: Optional[VcsRepoConfig] = Field(
        None,
        alias="vcs-repo",
        description="Settings for the workspace's VCS repository",
    )
    allow_destroy_plan: Optional[bool] = Field(
        True,
        alias="allow-destroy-plan",
        description="Whether to allow destruction plans",
    )
    queue_all_runs: Optional[bool] = Field(
        False,
        alias="queue-all-runs",
        description="Whether runs should be queued immediately",
    )
    source_name: Optional[str] = Field(
        None,
        alias="source-name",
        description="Indicates where the workspace settings originated",
    )
    source_url: Optional[str] = Field(
        None, alias="source-url", description="URL to origin source"
    )
    trigger_prefixes: Optional[List[str]] = Field(
        None, alias="trigger-prefixes", description="List of paths that trigger runs"
    )
    trigger_patterns: Optional[List[str]] = Field(
        None,
        alias="trigger-patterns",
        description="List of glob patterns that trigger runs",
    )
    setting_overwrites: Optional[Dict[str, bool]] = Field(
        None,
        alias="setting-overwrites",
        description="Specifies attributes that have organization-level defaults",
    )
class WorkspaceCreateRequest(BaseWorkspaceRequest):
    """Request model for creating a Terraform Cloud workspace.
    Validates and structures the request according to the Terraform Cloud API
    requirements for creating workspaces. Extends BaseWorkspaceRequest with
    required fields for creation.
    Reference: https://developer.hashicorp.com/terraform/cloud-docs/api-docs/workspaces#create-a-workspace
    Note:
        This inherits all configuration fields from BaseWorkspaceRequest
        while making organization and name required.
    See:
        docs/models/workspace.md for reference
    """
    # Override organization and name to make them required for creation
    organization: str = Field(
        ...,
        # No alias needed as field name matches API field name
        description="The name of the organization to create the workspace in",
    )
    name: str = Field(
        ...,
        # No alias needed as field name matches API field name
        description="Name of the workspace",
    )
class WorkspaceUpdateRequest(BaseWorkspaceRequest):
    """Request model for updating a Terraform Cloud workspace.
    Validates and structures the request for updating workspaces. Extends BaseWorkspaceRequest
    with routing fields while keeping all configuration fields optional.
    Reference: https://developer.hashicorp.com/terraform/cloud-docs/api-docs/workspaces#update-a-workspace
    Note:
        This inherits all configuration fields from BaseWorkspaceRequest
        and adds required routing fields for the update operation.
    See:
        docs/models/workspace.md for reference
    """
    # Add fields which are required for updates but not part of the workspace attributes payload
    organization: str = Field(
        ...,
        # No alias needed as field name matches API field name
        description="The name of the organization that owns the workspace",
    )
    workspace_name: str = Field(
        ...,
        # No alias needed as field name matches API field name
        description="The name of the workspace to update",
    )
class WorkspaceParams(BaseWorkspaceRequest):
    """Parameters for workspace operations without routing fields.
    This model provides all optional parameters for creating or updating workspaces,
    reusing field definitions from BaseWorkspaceRequest. It separates configuration
    parameters from routing information like organization and workspace name.
    Reference: https://developer.hashicorp.com/terraform/cloud-docs/api-docs/workspaces
    Note:
        When updating a workspace, use this model to specify only the attributes
        you want to change. Unspecified attributes retain their current values.
        All fields are inherited from BaseWorkspaceRequest.
    See:
        docs/models/workspace.md for reference
    """
    # Inherits model_config and all fields from BaseWorkspaceRequest
# Response handling is implemented through raw dictionaries
class DataRetentionPolicyRequest(APIRequest):
    """Request model for setting a data retention policy.
    Defines the parameters for the data retention policy API to specify
    how long Terraform Cloud should keep run history and state files.
    Reference: https://developer.hashicorp.com/terraform/cloud-docs/api-docs/workspaces#create-a-data-retention-policy
    See:
        docs/models/workspace.md for reference
    """
    workspace_id: str = Field(
        ..., description="The ID of the workspace to set the policy for"
    )
    days: int = Field(..., description="Number of days to retain data")
```
--------------------------------------------------------------------------------
/docs/models/variables.md:
--------------------------------------------------------------------------------
```markdown
# Variables Models
This document describes the data models used for variable and variable set operations in Terraform Cloud.
## Overview
Variable models provide structure and validation for interacting with the Terraform Cloud Variables API. These models define workspace variable configuration, variable set management, and cross-workspace variable sharing. Variables serve as input parameters for Terraform configurations and as environment settings for runs, while variable sets enable reusable variable collections shared across multiple workspaces and projects.
## Models Reference
### VariableCategory
**Type:** Enum (string)
**Description:** Defines the type of variable for use in Terraform operations.
**Values:**
- `terraform`: Terraform input variables available in configuration
- `env`: Environment variables available during plan/apply operations
**JSON representation:**
```json
{
  "category": "terraform"
}
```
**Usage Context:**
The variable category determines how the variable is exposed during Terraform operations. Terraform variables are available as input variables in the configuration, while environment variables are available to the Terraform process and any custom scripts.
**Notes:**
- HCL format is only valid for terraform category variables
- Environment variables are typically used for provider authentication or configuration
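**Example (Python, illustrative):**
A small, hypothetical helper showing how the category affects payload construction; it simply mirrors the JSON shapes used throughout this document.
```python
def variable_attributes(key: str, value: str, category: str, hcl: bool = False) -> dict:
    """Build the 'attributes' block of a variable payload.

    HCL formatting is only meaningful for 'terraform' variables, so the flag
    is dropped for environment variables.
    """
    attributes = {"key": key, "value": value, "category": category}
    if category == "terraform":
        attributes["hcl"] = hcl
    return attributes


print(variable_attributes("instance_type", "t3.micro", "terraform"))
print(variable_attributes("AWS_DEFAULT_REGION", "us-west-2", "env"))
```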
### WorkspaceVariable
**Type:** Object
**Description:** Complete model for workspace variable data including all configuration options.
**Fields:**
- `key` (string, required): Variable name/key (1-255 characters)
- `value` (string, optional): Variable value (max 256,000 characters)
- `description` (string, optional): Description of the variable (max 512 characters)
- `category` (VariableCategory, required): Variable category (terraform or env)
- `hcl` (boolean, optional): Whether the value is HCL code (default: false, terraform only)
- `sensitive` (boolean, optional): Whether the variable value is sensitive (default: false)
**JSON representation:**
```json
{
  "data": {
    "type": "vars",
    "attributes": {
      "key": "instance_type",
      "value": "t3.micro",
      "description": "EC2 instance type for the web server",
      "category": "terraform",
      "hcl": false,
      "sensitive": false
    }
  }
}
```
**Usage Context:**
Workspace variables provide configuration values specific to a single workspace. They can be Terraform input variables or environment variables, with support for sensitive values and HCL syntax for complex data structures.
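**Example (Python, illustrative):**
A sketch of a sensitive environment variable payload in dictionary form; the key and value are placeholders, and the API returns `null` for sensitive values once stored (see the list response below).
```python
sensitive_env_var = {
    "data": {
        "type": "vars",
        "attributes": {
            "key": "DATABASE_URL",
            "value": "postgres://user:secret@db.example.com:5432/app",  # placeholder
            "description": "Database connection string",
            "category": "env",
            "hcl": False,        # HCL only applies to terraform-category variables
            "sensitive": True,   # value is hidden in UI and API responses
        },
    }
}
```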
### VariableSet
**Type:** Object
**Description:** Configuration model for variable sets that can be shared across workspaces and projects.
**Fields:**
- `name` (string, required): Variable set name (1-90 characters)
- `description` (string, optional): Description of the variable set (max 512 characters)
- `global` (boolean, optional): Whether this is a global variable set (default: false)
- `priority` (boolean, optional): Whether this variable set takes priority over workspace variables (default: false)
**JSON representation:**
```json
{
  "data": {
    "type": "varsets",
    "attributes": {
      "name": "production-env",
      "description": "Common variables for production environment",
      "global": false,
      "priority": true
    }
  }
}
```
**Usage Context:**
Variable sets enable sharing common variables across multiple workspaces and projects. Global variable sets are automatically applied to all workspaces, while priority variable sets override workspace-level variables with the same keys.
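**Example (Python, illustrative):**
A sketch of the two configurations described above, as payload dictionaries; the second set's name is hypothetical.
```python
# Priority set: its values override workspace variables with matching keys.
priority_set = {
    "data": {
        "type": "varsets",
        "attributes": {
            "name": "production-env",
            "description": "Common variables for production environment",
            "global": False,
            "priority": True,
        },
    }
}

# Global set: applied automatically to every workspace in the organization.
global_set = {
    "data": {
        "type": "varsets",
        "attributes": {"name": "org-defaults", "global": True, "priority": False},
    }
}
```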
### WorkspaceVariableParams
**Type:** Object
**Description:** Parameters for workspace variable operations without routing fields. Used to specify configuration parameters for creating or updating workspace variables.
**Fields:**
- `key` (string, optional): Variable name/key
- `value` (string, optional): Variable value
- `description` (string, optional): Description of the variable
- `category` (VariableCategory, optional): Variable category (terraform or env)
- `hcl` (boolean, optional): Whether the value is HCL code (terraform variables only)
- `sensitive` (boolean, optional): Whether the variable value is sensitive
**Usage Context:**
Used as optional parameters in variable creation and update operations, allowing specification of only the fields that need to be set or changed.
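**Example (Python, illustrative):**
A hedged sketch of a partial update, assuming the model is importable from `terraform_cloud_mcp/models/variables.py` as the repository layout suggests.
```python
from terraform_cloud_mcp.models.variables import WorkspaceVariableParams

# Set only what should change; unspecified attributes are left as-is.
params = WorkspaceVariableParams(sensitive=True)
update_attributes = params.model_dump(exclude_none=True)
```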
### VariableSetParams
**Type:** Object
**Description:** Parameters for variable set operations without routing fields. Used to specify configuration parameters for creating or updating variable sets.
**Fields:**
- `name` (string, optional): Variable set name
- `description` (string, optional): Description of the variable set
- `global` (boolean, optional): Whether this is a global variable set
- `priority` (boolean, optional): Whether this variable set takes priority over workspace variables
**Usage Context:**
Used as optional parameters in variable set creation and update operations, enabling modification of only specified attributes while leaving others unchanged.
### WorkspaceVariableCreateRequest
**Type:** Request Validation Model
**Description:** Used to validate parameters for creating workspace variables.
**Fields:**
- `workspace_id` (string, required): The workspace ID (format: "ws-xxxxxxxx")
- `key` (string, required): Variable name/key
- `category` (VariableCategory, required): Variable category
- `params` (WorkspaceVariableParams, optional): Additional variable parameters
**Used by:**
- `create_workspace_variable` tool function to validate variable creation parameters
### WorkspaceVariableUpdateRequest
**Type:** Request Validation Model
**Description:** Used to validate parameters for updating workspace variables.
**Fields:**
- `workspace_id` (string, required): The workspace ID (format: "ws-xxxxxxxx")
- `variable_id` (string, required): The variable ID (format: "var-xxxxxxxx")
- `params` (WorkspaceVariableParams, optional): Variable parameters to update
**Used by:**
- `update_workspace_variable` tool function to validate variable update parameters
### VariableSetCreateRequest
**Type:** Request Validation Model
**Description:** Used to validate parameters for creating variable sets.
**Fields:**
- `organization` (string, required): The organization name
- `name` (string, required): Variable set name
- `params` (VariableSetParams, optional): Additional variable set parameters
**Used by:**
- `create_variable_set` tool function to validate variable set creation parameters
### VariableSetUpdateRequest
**Type:** Request Validation Model
**Description:** Used to validate parameters for updating variable sets.
**Fields:**
- `varset_id` (string, required): The variable set ID (format: "varset-xxxxxxxx")
- `params` (VariableSetParams, optional): Variable set parameters to update
**Used by:**
- `update_variable_set` tool function to validate variable set update parameters
### VariableSetAssignmentRequest
**Type:** Request Validation Model
**Description:** Used to validate parameters for assigning variable sets to workspaces or projects.
**Fields:**
- `varset_id` (string, required): The variable set ID (format: "varset-xxxxxxxx")
- `workspace_ids` (List[str], optional): List of workspace IDs to assign
- `project_ids` (List[str], optional): List of project IDs to assign
**Used by:**
- `assign_variable_set_to_workspaces` and `assign_variable_set_to_projects` tool functions
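**Example (Python, illustrative):**
The relationship payload that ultimately reaches the API is a list of resource identifier objects; a minimal sketch for two placeholder workspace IDs, following the same pattern `move_workspaces_to_project` uses in `tools/projects.py`.
```python
workspace_ids = ["ws-SihZTyXKfNXUWuUa", "ws-TjqZUyYLfOXWhwbC"]

assignment_payload = {
    "data": [{"type": "workspaces", "id": workspace_id} for workspace_id in workspace_ids]
}
```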
## API Response Structure
### Workspace Variable Response
```json
{
  "data": {
    "id": "var-EavQ1LztoRTQHSNT",
    "type": "vars",
    "attributes": {
      "key": "image_id",
      "value": "ami-0c55b159cbfafe1d0",
      "description": "AMI ID for the EC2 instance",
      "category": "terraform",
      "hcl": false,
      "sensitive": false,
      "created-at": "2023-05-01T12:34:56Z",
      "updated-at": "2023-05-01T12:34:56Z"
    },
    "relationships": {
      "configurable": {
        "data": {
          "id": "ws-SihZTyXKfNXUWuUa",
          "type": "workspaces"
        }
      }
    }
  }
}
```
### List Workspace Variables Response
```json
{
  "data": [
    {
      "id": "var-EavQ1LztoRTQHSNT",
      "type": "vars",
      "attributes": {
        "key": "image_id",
        "value": "ami-0c55b159cbfafe1d0",
        "description": "AMI ID for the EC2 instance",
        "category": "terraform",
        "hcl": false,
        "sensitive": false
      }
    },
    {
      "id": "var-GjyK2MzupSTRJTOU",
      "type": "vars",
      "attributes": {
        "key": "DATABASE_URL",
        "value": null,
        "description": "Database connection string",
        "category": "env",
        "hcl": false,
        "sensitive": true
      }
    }
  ]
}
```
### Variable Set Response
```json
{
  "data": {
    "id": "varset-47qC5ydmtMa8cYs2",
    "type": "varsets",
    "attributes": {
      "name": "aws-common-config",
      "description": "Common AWS configuration variables",
      "global": false,
      "priority": true,
      "created-at": "2023-05-01T12:34:56Z",
      "updated-at": "2023-05-01T12:34:56Z"
    },
    "relationships": {
      "organization": {
        "data": {
          "id": "org-ABcd1234",
          "type": "organizations"
        }
      },
      "vars": {
        "data": [
          {
            "id": "var-EavQ1LztoRTQHSNT",
            "type": "vars"
          }
        ]
      },
      "workspaces": {
        "data": [
          {
            "id": "ws-SihZTyXKfNXUWuUa",
            "type": "workspaces"
          }
        ]
      }
    }
  }
}
```
### List Variable Sets Response
```json
{
  "data": [
    {
      "id": "varset-47qC5ydmtMa8cYs2",
      "type": "varsets",
      "attributes": {
        "name": "production-config",
        "description": "Production environment variables",
        "global": false,
        "priority": true,
        "created-at": "2023-05-01T12:34:56Z"
      },
      "relationships": {
        "organization": {
          "data": {
            "id": "org-ABcd1234",
            "type": "organizations"
          }
        }
      }
    },
    {
      "id": "varset-58rD6zemvNb9dZt3",
      "type": "varsets",
      "attributes": {
        "name": "global-config",
        "description": "Global configuration variables",
        "global": true,
        "priority": false,
        "created-at": "2023-05-02T09:12:34Z"
      }
    }
  ],
  "meta": {
    "pagination": {
      "current-page": 1,
      "page-size": 20,
      "prev-page": null,
      "next-page": null,
      "total-pages": 1,
      "total-count": 2
    }
  }
}
```
## Related Resources
- [Variable Tools](../tools/variables.md)
- [Workspace Models](workspace.md)
- [Project Models](project.md)
- [Terraform Cloud API - Workspace Variables](https://developer.hashicorp.com/terraform/cloud-docs/api-docs/workspace-variables)
- [Terraform Cloud API - Variable Sets](https://developer.hashicorp.com/terraform/cloud-docs/api-docs/variable-sets)
```
--------------------------------------------------------------------------------
/terraform_cloud_mcp/tools/projects.py:
--------------------------------------------------------------------------------
```python
"""Project management tools for Terraform Cloud MCP
This module implements the project-related endpoints of the Terraform Cloud API.
Reference: https://developer.hashicorp.com/terraform/cloud-docs/api-docs/projects
"""
import logging
from typing import Dict, List, Optional
from ..api.client import api_request
from ..utils.decorators import handle_api_errors
from ..utils.payload import create_api_payload
from ..utils.request import query_params
from ..models.base import APIResponse
from ..models.projects import (
    ProjectCreateRequest,
    ProjectUpdateRequest,
    ProjectListRequest,
    ProjectParams,
    ProjectTagBindingRequest,
    TagBinding,
    WorkspaceMoveRequest,
)
@handle_api_errors
async def create_project(
    organization: str, name: str, params: Optional[ProjectParams] = None
) -> APIResponse:
    """Create a new project in an organization.
    Creates a new Terraform Cloud project which serves as a container for workspaces.
    Projects help organize workspaces into logical groups and can have their own
    settings and permissions.
    API endpoint: POST /organizations/{organization}/projects
    Args:
        organization: The name of the organization
        name: The name to give the project
        params: Additional project parameters (optional):
            - description: Human-readable description of the project
            - auto_destroy_activity_duration: How long each workspace should wait before auto-destroying
              (e.g., '14d', '24h')
            - tag_bindings: List of tag key-value pairs to bind to the project
    Returns:
        The created project data including configuration, settings and metadata
    See:
        docs/tools/project.md for reference documentation
    """
    param_dict = params.model_dump(exclude_none=True) if params else {}
    request = ProjectCreateRequest(organization=organization, name=name, **param_dict)
    # Create the base payload
    payload = create_api_payload(
        resource_type="projects", model=request, exclude_fields={"organization"}
    )
    # Handle tag bindings if present
    if request.tag_bindings:
        tag_bindings_data = []
        for tag in request.tag_bindings:
            tag_bindings_data.append(
                {
                    "type": "tag-bindings",
                    "attributes": {"key": tag.key, "value": tag.value},
                }
            )
        if "relationships" not in payload["data"]:
            payload["data"]["relationships"] = {}
        payload["data"]["relationships"]["tag-bindings"] = {"data": tag_bindings_data}
    # Remove tag-bindings from attributes if present since we've moved them to relationships
    if "tag-bindings" in payload["data"]["attributes"]:
        del payload["data"]["attributes"]["tag-bindings"]
    logger = logging.getLogger(__name__)
    logger.debug(f"Create project payload: {payload}")
    return await api_request(
        f"organizations/{organization}/projects", method="POST", data=payload
    )
@handle_api_errors
async def update_project(
    project_id: str, params: Optional[ProjectParams] = None
) -> APIResponse:
    """Update an existing project.
    Modifies the settings of a Terraform Cloud project. This can be used to change
    attributes like name, description, auto-destroy duration, or tags. Only specified
    attributes will be updated; unspecified attributes remain unchanged.
    API endpoint: PATCH /projects/{project_id}
    Args:
        project_id: The ID of the project to update (format: "prj-xxxxxxxx")
        params: Project parameters to update (optional):
            - name: New name for the project
            - description: Human-readable description of the project
            - auto_destroy_activity_duration: How long each workspace should wait before
              auto-destroying (e.g., '14d', '24h')
            - tag_bindings: List of tag key-value pairs to bind to the project
    Returns:
        The updated project with all current settings and configuration
    See:
        docs/tools/project.md for reference documentation
    """
    # Extract parameters from the params object if provided
    param_dict = params.model_dump(exclude_none=True) if params else {}
    # Create request using Pydantic model
    request = ProjectUpdateRequest(project_id=project_id, **param_dict)
    # Create base API payload using utility function
    payload = create_api_payload(
        resource_type="projects",
        model=request,
        exclude_fields={"project_id"},
    )
    # Handle tag bindings if present
    if request.tag_bindings:
        tag_bindings_data = []
        for tag in request.tag_bindings:
            tag_bindings_data.append(
                {
                    "type": "tag-bindings",
                    "attributes": {"key": tag.key, "value": tag.value},
                }
            )
        if "relationships" not in payload["data"]:
            payload["data"]["relationships"] = {}
        payload["data"]["relationships"]["tag-bindings"] = {"data": tag_bindings_data}
    # Remove tag-bindings from attributes if present since we've moved them to relationships
    if "tag-bindings" in payload["data"]["attributes"]:
        del payload["data"]["attributes"]["tag-bindings"]
    # Log payload for debugging
    logger = logging.getLogger(__name__)
    logger.debug(f"Update project payload: {payload}")
    # Make API request
    return await api_request(f"projects/{project_id}", method="PATCH", data=payload)
@handle_api_errors
async def list_projects(
    organization: str,
    page_number: int = 1,
    page_size: int = 20,
    q: Optional[str] = None,
    filter_names: Optional[str] = None,
    filter_permissions_update: Optional[bool] = None,
    filter_permissions_create_workspace: Optional[bool] = None,
    sort: Optional[str] = None,
) -> APIResponse:
    """List projects in an organization.
    Retrieves a paginated list of all projects in a Terraform Cloud organization.
    Results can be filtered using a search string or permissions filters to find
    specific projects.
    API endpoint: GET /organizations/{organization}/projects
    Args:
        organization: The name of the organization to list projects from
        page_number: The page number to return (default: 1)
        page_size: The number of items per page (default: 20, max: 100)
        q: Optional search query to filter projects by name
        filter_names: Filter projects by name (comma-separated list)
        filter_permissions_update: Filter projects that the user can update
        filter_permissions_create_workspace: Filter projects that the user can create workspaces in
        sort: Sort projects by name ('name' or '-name' for descending)
    Returns:
        Paginated list of projects with their configuration settings and metadata
    See:
        docs/tools/project.md for reference documentation
    """
    # Create request using Pydantic model for validation
    request = ProjectListRequest(
        organization=organization,
        page_number=page_number,
        page_size=page_size,
        q=q,
        filter_names=filter_names,
        filter_permissions_update=filter_permissions_update,
        filter_permissions_create_workspace=filter_permissions_create_workspace,
        sort=sort,
    )
    # Use the unified query params utility function
    params = query_params(request)
    # Make API request
    return await api_request(
        f"organizations/{organization}/projects", method="GET", params=params
    )
@handle_api_errors
async def get_project_details(project_id: str) -> APIResponse:
    """Get details for a specific project.
    Retrieves comprehensive information about a project including its configuration,
    tag bindings, workspace count, and other attributes.
    API endpoint: GET /projects/{project_id}
    Args:
        project_id: The ID of the project (format: "prj-xxxxxxxx")
    Returns:
        Project details including settings, configuration and status
    See:
        docs/tools/project.md for reference documentation
    """
    # Make API request
    return await api_request(f"projects/{project_id}", method="GET")
@handle_api_errors
async def delete_project(project_id: str) -> APIResponse:
    """Delete a project.
    Permanently deletes a Terraform Cloud project. This operation will
    fail if the project contains any workspaces or stacks.
    API endpoint: DELETE /projects/{project_id}
    Args:
        project_id: The ID of the project to delete (format: "prj-xxxxxxxx")
    Returns:
        Empty response with HTTP 204 status code if successful
    See:
        docs/tools/project.md for reference documentation
    """
    # Make API request
    return await api_request(f"projects/{project_id}", method="DELETE")
@handle_api_errors
async def list_project_tag_bindings(project_id: str) -> APIResponse:
    """List tag bindings for a project.
    Retrieves the list of tags bound to a specific project. These tags are
    inherited by all workspaces within the project.
    API endpoint: GET /projects/{project_id}/tag-bindings
    Args:
        project_id: The ID of the project (format: "prj-xxxxxxxx")
    Returns:
        List of tag bindings with their key-value pairs and creation timestamps
    See:
        docs/tools/project.md for reference documentation
    """
    # Make API request
    return await api_request(f"projects/{project_id}/tag-bindings", method="GET")
@handle_api_errors
async def add_update_project_tag_bindings(
    project_id: str, tag_bindings: List[TagBinding]
) -> APIResponse:
    """Add or update tag bindings on a project.
    Adds new tag bindings or updates existing ones on a project. This is an
    additive operation that doesn't remove existing tags.
    API endpoint: PATCH /projects/{project_id}/tag-bindings
    Args:
        project_id: The ID of the project (format: "prj-xxxxxxxx")
        tag_bindings: List of TagBinding objects with key-value pairs to add or update
    Returns:
        The complete list of updated tag bindings for the project
    See:
        docs/tools/project.md for reference documentation
    """
    # Create request using Pydantic model
    request = ProjectTagBindingRequest(project_id=project_id, tag_bindings=tag_bindings)
    # Create payload
    tag_bindings_data = []
    for tag in request.tag_bindings:
        tag_bindings_data.append(
            {"type": "tag-bindings", "attributes": {"key": tag.key, "value": tag.value}}
        )
    payload = {"data": tag_bindings_data}
    # Make API request
    return await api_request(
        f"projects/{project_id}/tag-bindings", method="PATCH", data=payload
    )
@handle_api_errors
async def move_workspaces_to_project(
    project_id: str, workspace_ids: List[str]
) -> APIResponse:
    """Move workspaces into a project.
    Moves one or more workspaces into a project. The user must have permission
    to move workspaces on both source and destination projects.
    API endpoint: POST /projects/{project_id}/relationships/workspaces
    Args:
        project_id: The ID of the destination project (format: "prj-xxxxxxxx")
        workspace_ids: List of workspace IDs to move (format: ["ws-xxxxxxxx", ...])
    Returns:
        Empty response with HTTP 204 status code if successful
    See:
        docs/tools/project.md for reference documentation
    """
    # Create request using Pydantic model
    request = WorkspaceMoveRequest(project_id=project_id, workspace_ids=workspace_ids)
    # Create payload
    payload: Dict[str, List[Dict[str, str]]] = {"data": []}
    for workspace_id in request.workspace_ids:
        payload["data"].append({"type": "workspaces", "id": workspace_id})
    # Make API request
    return await api_request(
        f"projects/{project_id}/relationships/workspaces", method="POST", data=payload
    )
```