Exploring the Google Cloud Stream Processing Framework

Learn how to get started creating pipelines in Python with Google's Dataflow framework.

Google's Dataflow framework is a data processing engine that supports various types of data-related jobs: ETL, batch jobs and, most importantly, stream processing. It is based on a programming model that accounts for complex problems such as balancing latency against consistency and handling out-of-order or late data. You can run Dataflow pipelines on the Google Cloud Platform, where resource management is abstracted away so you can focus on the problems you care about rather than on moving bits around. See more at DevX.
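To make this concrete, here is a minimal sketch of a Python pipeline written with the Apache Beam SDK, which is the open-source SDK used to build Dataflow pipelines. The bucket paths (gs://your-bucket/...) are placeholders, and the pipeline options shown in the comment (--runner=DataflowRunner, --project, --region) are illustrative of how such a job is typically submitted to Google Cloud rather than a complete configuration.

```python
# A minimal word-count pipeline sketch using the Apache Beam Python SDK.
# Run locally by default; to run on Google Cloud Dataflow, pass options such as
# --runner=DataflowRunner --project=<your-project> --region=<your-region>
# (values shown here are placeholders, not a complete configuration).
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def run():
    options = PipelineOptions()  # picks up command-line pipeline options
    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            | "ReadLines" >> beam.io.ReadFromText("gs://your-bucket/input.txt")
            | "SplitWords" >> beam.FlatMap(lambda line: line.split())
            | "PairWithOne" >> beam.Map(lambda word: (word, 1))
            | "CountPerWord" >> beam.CombinePerKey(sum)
            | "FormatResults" >> beam.Map(lambda kv: f"{kv[0]}: {kv[1]}")
            | "WriteResults" >> beam.io.WriteToText("gs://your-bucket/output")
        )


if __name__ == "__main__":
    run()
```

The same pipeline code runs unchanged on a local machine or on the Dataflow service; only the runner and its options differ, which is what lets the platform handle resource management while you focus on the transforms themselves.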