findAIList
Search by task, compare top tools, and use proven workflows to choose the right AI tool faster.


© 2026 findAIList. All rights reserved.


Kedro


Quick Tool Decision

Should you use Kedro?

The open-source Python framework for reproducible, maintainable, and modular data science code.

Category

Data & ML

Data confidence: release and verification fields are source-audited when available; other summary fields are community-aggregated.


Overview

Kedro is an open-source Python framework designed to help data scientists and engineers create production-ready data pipelines. Originally developed by McKinsey's QuantumBlack and now part of the LF AI & Data Foundation, Kedro addresses the 'notebook-to-production' gap by enforcing software engineering best practices—such as modularity, separation of concerns, and versioning—within the data science workflow. Its architecture centers on a 'Data Catalog', which abstracts data access, and a 'Pipeline' structure composed of 'Nodes' (pure Python functions). This decoupling lets teams swap data sources or execution environments without rewriting core logic. In the 2026 market, Kedro remains a leading choice for enterprise-grade data orchestration where governance, auditability, and team collaboration are paramount. It integrates with modern stack components such as MLflow, Great Expectations, and Airflow, and provides a standardized project structure (based on Cookiecutter) that enables rapid onboarding and scale-out across distributed computing environments like Apache Spark or Dask.
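The decoupling described above can be illustrated with a minimal, hand-rolled sketch — this is deliberately NOT Kedro's actual API, just the underlying pattern: pure-function "nodes" are wired together by name through a "catalog" that owns all data access, so changing where data lives never touches pipeline logic. The node functions and dataset names here are invented for illustration.

```python
# Hand-rolled sketch of Kedro's core idea (not Kedro's real API):
# nodes are pure functions; a catalog maps dataset names to data,
# so swapping a data source never changes the pipeline definition.

def clean(raw_rows):
    """Node: drop rows containing missing values (a pure function)."""
    return [r for r in raw_rows if None not in r.values()]

def summarize(clean_rows):
    """Node: compute a simple summary of the cleaned data."""
    return {"row_count": len(clean_rows)}

# The "catalog": here an in-memory dict, but in Kedro proper this layer
# could load/save Parquet files, S3 objects, database tables, etc.
catalog = {"raw_rows": [{"a": 1}, {"a": None}, {"a": 3}]}

# The "pipeline": ordered (function, input name, output name) triples.
pipeline = [
    (clean, "raw_rows", "clean_rows"),
    (summarize, "clean_rows", "summary"),
]

def run(pipeline, catalog):
    """Execute each node, reading inputs from and writing outputs to the catalog."""
    for func, inp, out in pipeline:
        catalog[out] = func(catalog[inp])
    return catalog

result = run(pipeline, catalog)
print(result["summary"])  # {'row_count': 2}
```

Because nodes only ever see plain Python objects handed to them by the runner, the same pipeline can be pointed at a different catalog (local files in development, cloud storage in production) without modification — the property the Overview attributes to Kedro's Data Catalog.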

Common tasks

  • Data Pipeline Orchestration
  • ETL Development
  • Machine Learning Engineering
  • Data Cataloging
  • Software Engineering for Data Science

FAQ

Full FAQ is available in the detailed profile.

Pricing


Pricing varies

Plan-level pricing details are still being validated for this tool.

Pros & Cons

Pros/cons are still being audited for this tool.

Reviews & Ratings

Share your experience, and users can reply directly under each review.

Need advanced specs, integrations, implementation notes, and deeper comparisons? Open the Detailed Profile.
