Kedro is a development workflow framework that aims to become the industry standard for developing production-ready code. Kedro helps you structure your data pipeline using software engineering principles, eliminating project delays caused by code rewrites and freeing up more time to focus on building robust pipelines. The framework also provides a standardised approach to collaboration for teams building robust, scalable, deployable, reproducible and versioned data pipelines.

The features provided by Kedro include:

  • A standard and easy-to-use project template, allowing your collaborators to spend less time understanding how you’ve set up your analytics project
  • Data abstraction, managing how you load and save data so that you don’t have to worry about the reproducibility of your code in different environments
  • Configuration management, helping you keep credentials out of your code base
  • Support for test-driven development and industry-standard code quality, decreasing operational risks for businesses
  • Modularity, allowing you to break large chunks of code into smaller self-contained and understandable logical units
  • Pipeline visualisation making it easy to see how your data pipeline is constructed
  • Seamless packaging, allowing you to ship your projects to production, e.g. using Docker or Airflow
  • Versioning for your datasets and machine learning models whenever your pipeline runs
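To make the modularity and pipeline ideas above concrete, here is a minimal sketch in plain Python. This is a conceptual illustration only, not Kedro's actual API: the node functions, the `(function, input, output)` tuples, and the `run` helper are all hypothetical stand-ins for Kedro's node, pipeline and data catalog abstractions.

```python
# Hypothetical sketch of the node-and-pipeline idea (not Kedro's real API).

def clean(raw):
    """Node: strip whitespace from each record."""
    return [record.strip() for record in raw]

def count(cleaned):
    """Node: count the cleaned records."""
    return len(cleaned)

# A pipeline here is an ordered list of (function, input_name, output_name)
# nodes; each node is a small, self-contained, testable unit.
pipeline = [
    (clean, "raw_data", "cleaned_data"),
    (count, "cleaned_data", "record_count"),
]

def run(pipeline, catalog):
    """Run each node, reading inputs from and writing outputs to the catalog.

    The catalog dict stands in for Kedro's data abstraction layer: nodes
    never load or save data themselves, they only name their inputs/outputs.
    """
    for func, input_name, output_name in pipeline:
        catalog[output_name] = func(catalog[input_name])
    return catalog

catalog = run(pipeline, {"raw_data": ["  a ", "b  ", " c "]})
print(catalog["record_count"])  # -> 3
```

Because each node only declares the names of its inputs and outputs, the same pipeline can run against different data sources simply by changing what the catalog maps those names to.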

Kedro is suitable for a wide range of applications, ranging from single-user projects, to enterprise-level software driving business decisions backed by machine learning models.

Read the full article here.
