The most basic way of invoking Python is python -c "print('Hello, world.')". In practice, however, dependencies usually need to be satisfied before the code can start, you want it to finish quickly, and you may not know in advance whether it can be trusted.
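For example, a quick way to check that dependencies are importable before running anything is the standard library's importlib machinery. A minimal sketch (the numpy name is just a placeholder dependency):

```python
import importlib.util
import sys

# Placeholder dependency list; substitute whatever the snippet actually needs.
required = ["numpy"]

missing = [name for name in required if importlib.util.find_spec(name) is None]
if missing:
    sys.exit(f"Missing dependencies: {', '.join(missing)}")

print("All dependencies satisfied.")
```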
Discovery

Before running code, let's discover it.

Barentsz can explore and discover modules, classes, functions, and attributes.
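A rough sketch of what discovery could look like with barentsz; the helper names discover_classes and discover_functions are assumptions recalled from the project's README, so verify them against the docs:

```python
from pathlib import Path

# Assumed barentsz helpers; check the exact names and signatures in the project docs.
from barentsz import discover_classes, discover_functions

src = Path("./my_package")  # hypothetical package directory

# Discover all classes and functions defined under the given path.
classes = discover_classes(src)
functions = discover_functions(src)

print(f"Found {len(classes)} classes and {len(functions)} functions.")
```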
Runtime
Mostly about compute: running code efficiently and in isolation.
Apache Spark is a unified analytics engine for large-scale data processing.
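A minimal PySpark sketch using a local session; the local[*] master and the toy data are assumptions for illustration:

```python
from pyspark.sql import SparkSession

# Start a local Spark session; in a real deployment this would point at a cluster.
spark = SparkSession.builder.master("local[*]").appName("demo").getOrCreate()

df = spark.createDataFrame([("a", 1), ("b", 2), ("a", 3)], ["key", "value"])
df.groupBy("key").sum("value").show()

spark.stop()
```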
Dask.distributed is a lightweight library for distributed computing in Python. It extends both the concurrent.futures and dask APIs to moderate-sized clusters.
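A small sketch of dask.distributed's futures-style API on a local cluster; the square function is a placeholder workload:

```python
from dask.distributed import Client

def square(x):
    return x * x

if __name__ == "__main__":
    # With no address given, Client() starts a local cluster of worker processes.
    client = Client()

    # Submit work and gather results, mirroring the concurrent.futures API.
    futures = client.map(square, range(10))
    print(client.gather(futures))

    client.close()
```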
Pyper is a flexible framework for concurrent and parallel data-processing, based on functional programming patterns. Used for 🔀 ETL Systems, ⚙️ Data Microservices, and 🌐 Data Collection.
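Pyper's own API isn't shown here; as a rough illustration of the underlying pattern, this sketch chains an extract-transform-load style pipeline with the standard library's concurrent.futures instead:

```python
from concurrent.futures import ThreadPoolExecutor

def extract():
    # Placeholder source; an ETL system might read from an API, file, or queue.
    yield from range(10)

def transform(item):
    return item * 2

def load(item):
    print(f"loaded {item}")

if __name__ == "__main__":
    # Run the transform stage concurrently while extract and load stay sequential.
    with ThreadPoolExecutor(max_workers=4) as pool:
        for result in pool.map(transform, extract()):
            load(result)
```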
Ray is a unified framework for scaling AI and Python applications. Ray consists of a core distributed runtime and a set of AI libraries for simplifying ML compute.
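A minimal sketch of Ray's core remote-task API; the square function is a placeholder workload:

```python
import ray

ray.init()  # Starts a local Ray runtime; in production this connects to a cluster.

@ray.remote
def square(x):
    return x * x

# Launch tasks in parallel and block on the results.
futures = [square.remote(i) for i in range(10)]
print(ray.get(futures))

ray.shutdown()
```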