Reading/writing to storage
Terality uses the same methods as pandas to load data, such as read_csv, read_parquet, and so on. Example:
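A minimal sketch, assuming the usual `import terality as te` alias; the local path and S3 bucket below are placeholders:

```python
import terality as te

# Read a local CSV file, exactly as you would with pandas.
df = te.read_csv("data/events.csv")

# Read a Parquet file stored on AWS S3 (placeholder path).
df = te.read_parquet("s3://my-bucket/events.parquet")
```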
You can import data just as you would with pandas, for example with read_csv or read_parquet on a local file or on your cloud storage (such as AWS S3). You can find the currently supported functions in the dedicated section of this documentation.
You can also read a whole folder of files by specifying a folder path to the read method, as shown in the sketch after this list. This is supported for the following functions:
read_csv
read_parquet
read_excel
read_json
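A sketch of a folder read, assuming the same `te` alias; the S3 prefix is a placeholder:

```python
import terality as te

# Passing a folder path instead of a single file loads every file in that folder.
df = te.read_parquet("s3://my-bucket/daily-exports/")
```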
Do not hesitate to contact us if you want us to implement any other read function.
In addition, Terality provides a way to convert pandas objects into Terality structures through a dedicated conversion method.
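A sketch of such a conversion; the helper name used here (`from_pandas`) is an assumption and may differ from the actual API, so check the reference for the exact method:

```python
import pandas as pd
import terality as te

pd_df = pd.DataFrame({"a": [1, 2, 3]})

# Convert an in-memory pandas DataFrame into a Terality DataFrame.
# `te.DataFrame.from_pandas` is assumed here for illustration.
te_df = te.DataFrame.from_pandas(pd_df)
```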
If you're done working on your DataFrame for the moment, or it's still too big to be held in memory on your computer, you may want to download it and save it back to your local drive or cloud storage. To do this, simply use the same API as pandas:
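A sketch of writing results back, with placeholder paths:

```python
# Save to local disk as CSV, or to cloud storage as Parquet, using the pandas API.
df.to_csv("results/output.csv")
df.to_parquet("s3://my-bucket/results/output.parquet")
```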
Best practice: we recommend adopting a modern and scalable data workflow by using:
cloud storage rather than local storage (so that transfers are not limited by your bandwidth)
a modern, fast, and scalable data format such as Parquet, rather than CSV
You can also write your data to a folder of multiple files using to_csv_folder or to_parquet_folder from the Terality API.
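A sketch of a folder export; the destination prefix is a placeholder and any additional parameters of these methods are not shown:

```python
# Write the DataFrame as multiple files under a folder path.
df.to_parquet_folder("s3://my-bucket/exports/")
```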