# Takk

> Takk is a PaaS/SaaS service that simplifies Python deployment. Define your entire application stack in pure Python - web services, background workers, databases, and infrastructure - then deploy with a single command.

## Overview

Takk eliminates the need for Docker Compose, Terraform, and CI/CD configuration files. Instead, you define everything in a single, type-safe Python `Project` class. Takk handles containerization, networking, secrets management, database provisioning, and TLS certificates automatically.

The required resources are derived from a pydantic `BaseModel` or `BaseSettings` class added to the `Project` definition. Resources are not provisioned if values for them already exist in a `.env` file or as environment variables.

## Assumptions

Takk assumes that Docker is installed and running.

## Key Features

- **Single-Command Deployment**: Run `takk up` locally, `takk deploy` for the cloud
- **Automatic Infrastructure**: Databases, queues, and storage provisioned from code analysis
- **Type-Safe Configuration**: A Python `Project` class with full IDE support
- **Secret Management**: Secure injection of credentials per component
- **Multiple App Types**: FastAPI, Streamlit, MLflow, background workers, scheduled jobs

## Documentation Sections

### Core Concepts

- [Getting Started Guide](https://docs.takk.cloud/llms/get_started.md) - Installation and first project
- [Project Definition](https://docs.takk.cloud/llms/project_definition.md) - The central configuration object
- [Secrets Management](https://docs.takk.cloud/llms/secrets.md) - Secure credential handling
- [Configure Resources](https://docs.takk.cloud/llms/resources.md) - Configure resources like PostgreSQL, Redis, MongoDB, etc.
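The provisioning rule described above (a declared resource is skipped when a value for it already exists in `.env` or the environment) can be illustrated with a small sketch. This is not Takk's actual implementation: `parse_dotenv` and `needs_provisioning` are hypothetical helpers, and the upper-cased variable names follow the usual pydantic-settings convention for mapping fields to environment variables.

```python
import os


def parse_dotenv(text: str) -> dict[str, str]:
    """Parse a minimal KEY=VALUE .env format (comments and blank lines ignored)."""
    values = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        values[key.strip()] = value.strip()
    return values


def needs_provisioning(field: str, dotenv: dict[str, str]) -> bool:
    """A declared settings field triggers provisioning only when no value
    exists in the .env file or the process environment."""
    key = field.upper()
    return key not in dotenv and key not in os.environ


dotenv = parse_dotenv("PSQL_URI=postgresql://user:pw@host/db\n# AI_TOKEN left unset")
print(needs_provisioning("psql_uri", dotenv))  # False: a value was found in .env
print(needs_provisioning("ai_token", dotenv))  # True: nothing set, so it would be provisioned
```

The same check runs per field, which is why adding a `PostgresDsn` field with no corresponding value is enough to get a database, while a filled-in `.env` keeps Takk from touching anything.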
### Application Types

- [Network Applications](https://docs.takk.cloud/llms/network_apps.md) - Define network applications like FastAPI, Streamlit, and MLflow servers
- [Background Jobs](https://docs.takk.cloud/llms/jobs.md) - Scheduled, cron, and one-time tasks
- [Queues and Workers](https://docs.takk.cloud/llms/queues.md) - Message-driven processing
- [Pub/Sub](https://docs.takk.cloud/llms/pubsub.md) - Event-driven messaging

### Development & Deployment

- [Local Development](https://docs.takk.cloud/llms/local.md) - Running locally with Docker
- [Deployment Guide](https://docs.takk.cloud/llms/deploy.md) - Deploying to cloud environments

### Examples

- [Support Ticket Classification](https://docs.takk.cloud/llms/support_ticket.md) - AI-powered ticket routing example

### Troubleshooting

- [FAQs](https://docs.takk.cloud/llms/troubleshooting.md) - Common questions and solutions

## Quick Start Example

Install with `uv`:

```bash
uv add takk
```

```python
from pydantic import PostgresDsn, SecretStr
from pydantic_settings import BaseSettings

from takk import Project, NetworkApp, Job, value_for
from takk.secrets import AiToken, AiBaseAPI, AiBaseUrl
from takk.resources import ServerlessPostgresInstance

from app.jobs import materialize_job, MaterializeArgs


class AppSettings(BaseSettings):
    # Declaring PostgresDsn automatically provisions a serverless PostgreSQL cluster
    # if no value is provided in .env or environment variables
    psql_uri: PostgresDsn

    # A single connection serves all AI model types: Chat, Vision, Embedding, AudioTranscriber.
    # AiBaseAPI is for OpenAI-compatible endpoints (base URL includes /v1).
    # AiBaseUrl is for Anthropic-compatible endpoints (base URL only, no /v1).
    # AiToken is the API key for either provider.
    ai_api: AiBaseAPI
    ai_token: AiToken


project = Project(
    name="my-api",
    shared_secrets=[AppSettings],
    # A uvicorn app named `my_app` exposing port 8000
    my_app=NetworkApp(
        command=["/bin/bash", "-c", "uvicorn src.app:app --host 0.0.0.0 --port 8000"],
        port=8000,
        docker_image=None,  # None means using the managed Python image
    ),
    # A scheduled job named `background_job`
    background_job=Job(
        materialize_job,
        cron_schedule=value_for(prod="0 3 * * *", default=None),  # Runs daily at 3 AM in prod
        arguments=MaterializeArgs(),
    ),
    # Override the default serverless PostgreSQL to enable extensions.
    # Without this, Takk provisions a ServerlessPostgresInstance with default settings.
    default=ServerlessPostgresInstance(extensions=["pgvector"]),
)
```

Then run:

```bash
uv run takk up               # Local development with automatic hot reloading for uvicorn
uv run takk deploy --env qa  # Cloud deployment
```

## Contact

- Documentation: https://docs.takk.cloud
- Takk Cloud: https://takk.dev/
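The `value_for(prod="0 3 * * *", default=None)` call in the Quick Start picks a value per deployment environment: the job gets a cron schedule in `prod` and none elsewhere. A minimal stdlib sketch of that idea follows; it is a hypothetical reimplementation for illustration, not Takk's actual code, and it assumes the active environment name is known when the value is resolved.

```python
def value_for(default=None, **per_env):
    """Return a resolver that picks the value for the active environment,
    falling back to `default` when no environment-specific value is given."""
    def resolve(env: str):
        return per_env.get(env, default)
    return resolve


cron = value_for(prod="0 3 * * *", default=None)
print(cron("prod"))  # 0 3 * * *  -> scheduled daily at 3 AM in prod
print(cron("qa"))    # None       -> not scheduled in qa
```

Keeping per-environment differences in one expression, rather than in separate config files, is what lets the whole `Project` stay a single Python object.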