# Datazone

## Docs

- [Change Logs](https://docs.datazone.co/changelog.md): Release notes and feature updates for Datazone
- [Private Cloud](https://docs.datazone.co/deployment/private-cloud.md): Enterprise deployment of Datazone in your cloud environment
- [Public Cloud](https://docs.datazone.co/deployment/public-cloud.md): Datazone public cloud deployment for community and pro users
- [From Zero to Production!](https://docs.datazone.co/from-zero-to-production.md): In this guide, you will learn how to build a Data Lakehouse from scratch using Datazone, including creating intelligent apps, deploying AI agents, and exposing data via secure endpoints.
- [Installation](https://docs.datazone.co/installation.md): Before starting to use Datazone, you need to install the Datazone Python CLI. The client is a Python package that provides a simple interface to interact with the Datazone API.
- [Introduction](https://docs.datazone.co/introduction.md): Build, manage, and serve your data pipelines and intelligent applications in minutes with Datazone.
- [Key Concepts](https://docs.datazone.co/key-concepts.md): Datazone is a modern data platform that simplifies your data engineering journey by providing a unified environment for data ingestion, processing, analysis, and AI-driven automation. It seamlessly connects your data sources to a robust data lakehouse while offering powerful tools for transformation…
- [Best Practices](https://docs.datazone.co/reference/agents/best-practices.md): Optimize your agents for better performance, accuracy, and cost efficiency
- [Chat Interface](https://docs.datazone.co/reference/agents/chat.md): Interact with your agents through conversational AI
- [Overview](https://docs.datazone.co/reference/agents/overview.md): Create custom AI agents to interact with your data through natural language
- [Kernels](https://docs.datazone.co/reference/analysis/kernel.md): Kernels are the execution environments for your Datazone notebooks.
- [Notebooks](https://docs.datazone.co/reference/analysis/notebook.md): Notebooks are interactive documents where you can write and run your code in Datazone.
- [Toolkit](https://docs.datazone.co/reference/analysis/toolkit.md): The Toolkit is a collection of tools and utilities that help you manage your notebooks in Datazone.
- [Access Keys](https://docs.datazone.co/reference/development/access-keys.md): Create and manage access keys for programmatic access to your project resources
- [Actions](https://docs.datazone.co/reference/development/actions.md): Deploy serverless Python functions to automate workflows and extend agent capabilities
- [API Keys](https://docs.datazone.co/reference/development/api-key.md): API keys are used to authenticate requests to the Datazone API. You can create and manage your API keys from the Datazone dashboard.
- [Command Line](https://docs.datazone.co/reference/development/command-line.md): One way to interact with Datazone is through the Command Line Interface. You can manage your projects, datasets, and models with Datazone CLI commands.
- [Context](https://docs.datazone.co/reference/development/context.md): Each pipeline in Datazone has a context object that provides access to resources and configuration settings.
- [File Container](https://docs.datazone.co/reference/development/file-container.md): File Container is a storage solution in the Datazone platform that allows you to manage and store files. You can create, update, and delete file containers, and use them to store data for your pipelines and notebooks.
- [Model Accounts](https://docs.datazone.co/reference/development/model-accounts.md): Configure and manage AI model provider credentials for building intelligent applications with Datazone
- [Pipeline](https://docs.datazone.co/reference/development/pipeline.md): Pipelines are the main building blocks of your project. You can define your data processing steps in a pipeline file and deploy it to Datazone.
- [Policy](https://docs.datazone.co/reference/development/policy.md): Role-based access control with hierarchical permissions for fine-grained authorization
- [Project Repository](https://docs.datazone.co/reference/development/project.md): Organize and deploy your pipelines, actions, apps, and endpoints in a single project structure.
- [Transform](https://docs.datazone.co/reference/development/transform.md): Transform functions are the building blocks of your pipeline. You can break your data processing steps down into small functions and chain them together to build a pipeline.
- [Variables](https://docs.datazone.co/reference/development/variables.md): Variables are used to store and manage data in the Datazone platform. You can define variables in the Datazone dashboard and use them in your pipelines.
- [Vectors](https://docs.datazone.co/reference/development/vectors.md): Transform your data into searchable embeddings for AI-powered applications
- [API Key Authentication](https://docs.datazone.co/reference/integration/authentication/api-key.md): Learn how to authenticate with Datazone using API keys
- [Azure AD (Entra ID) SAML Setup](https://docs.datazone.co/reference/integration/authentication/saml/azure-ad.md): Configure SAML authentication with Microsoft Azure Active Directory
- [Google Workspace SAML Setup](https://docs.datazone.co/reference/integration/authentication/saml/google-workspace.md): Configure SAML authentication with Google Workspace
- [Okta SAML Setup](https://docs.datazone.co/reference/integration/authentication/saml/okta.md): Configure SAML authentication with Okta
- [SAML Authentication](https://docs.datazone.co/reference/integration/authentication/saml/overview.md): Configure SAML-based single sign-on for your Datazone instance
- [Endpoints](https://docs.datazone.co/reference/integration/endpoints.md): Create custom API endpoints for secure data access
- [ODBC/JDBC Connections](https://docs.datazone.co/reference/integration/odbc-jdbc-connection.md): Connect to Datazone using ClickHouse ODBC or JDBC drivers
- [Overview](https://docs.datazone.co/reference/integration/overview.md): Overview of Datazone Integration and API capabilities
- [Views](https://docs.datazone.co/reference/integration/views.md): Create optimized relational database views from your datasets with advanced partitioning and indexing
- [Components](https://docs.datazone.co/reference/intelligent-apps/components.md): Detailed reference for components in Datazone Intelligent Apps
- [Embedding](https://docs.datazone.co/reference/intelligent-apps/embedding.md): You can build a data-intensive application in a couple of minutes with Datazone.
- [Filters](https://docs.datazone.co/reference/intelligent-apps/filters.md): How to use filters for interactivity in Datazone Intelligent Apps.
- [Orion AI (LLM Assistant)](https://docs.datazone.co/reference/intelligent-apps/orion-ai.md): Overview of Orion AI, the LLM-powered assistant for Intelligent Apps.
- [Overview](https://docs.datazone.co/reference/intelligent-apps/overview.md): Build interactive data applications with Datazone's Intelligent Apps feature
- [Variable Usage](https://docs.datazone.co/reference/intelligent-apps/query-manipulation.md): Examples of using variables and Jinja-style templating in Intelligent App queries.
- [YAML Reference](https://docs.datazone.co/reference/intelligent-apps/yaml-reference.md): Comprehensive reference for all YAML attributes in Intelligent App definitions.
- [Channels](https://docs.datazone.co/reference/platform/channels.md): Configure notification channels for sending reports and alerts
- [Reports](https://docs.datazone.co/reference/platform/reports.md): Schedule and automate intelligent app report delivery
- [Resources Overview](https://docs.datazone.co/reference/platform/resources/overview.md): Understanding Datazone resource types and their usage
- [Quotas](https://docs.datazone.co/reference/platform/resources/quotas.md): Set limits and manage resource consumption with quotas
- [AWS S3 CSV](https://docs.datazone.co/reference/sources/aws-s3.md): AWS S3 is a scalable object storage service that can be used to store and retrieve files.
- [Azure Blob Storage](https://docs.datazone.co/reference/sources/azure-blob.md): Azure Blob Storage is Microsoft's object storage solution for the cloud, designed to store massive amounts of unstructured data.
- [MongoDB](https://docs.datazone.co/reference/sources/mongodb.md): MongoDB is a popular NoSQL database that stores data in flexible, JSON-like documents.
- [Microsoft SQL Server](https://docs.datazone.co/reference/sources/mssqlserver.md): Microsoft SQL Server is a relational database management system developed by Microsoft.
- [MySQL](https://docs.datazone.co/reference/sources/mysql.md): MySQL is an open-source relational database management system (RDBMS).
- [Oracle](https://docs.datazone.co/reference/sources/oracle.md): Oracle is a powerful enterprise-grade relational database management system.
- [Data Ingestion](https://docs.datazone.co/reference/sources/overview.md): Data ingestion is the first step in Datazone's data journey.
- [PostgreSQL](https://docs.datazone.co/reference/sources/postgresql.md): PostgreSQL is a powerful, open source object-relational database system.
- [SAP HANA](https://docs.datazone.co/reference/sources/sap-hana.md): SAP HANA is an in-memory, column-oriented, relational database management system.
- [UI Overview](https://docs.datazone.co/reference/ui-overview.md): Navigate Datazone's main interface sections
- [Datazone SDK](https://docs.datazone.co/tutorial/datazone-sdk.md): Learn how to use the Datazone SDK to access and manage your Datazone data from your local environment.
- [Building an AI-Powered Customer Response Automation System with Datazone](https://docs.datazone.co/tutorial/examples/ai-powered-message-automation.md): We built this cool email response system using Datazone, and it's handling customer support emails like a champ! 🚀
- [Pipeline Examples](https://docs.datazone.co/tutorial/examples/pipeline-examples.md): Learn how to create and manage data pipelines in Datazone through practical examples
- [PySpark Examples in Datazone Transforms](https://docs.datazone.co/tutorial/examples/pyspark-transform-examples.md)

## OpenAPI Specs

- [openapi](https://docs.datazone.co/api-reference/openapi.json)

## Optional

- [Blog](https://www.datazone.co/blog)