License: GPL v3

Compatibilities

Works on Ubuntu, other Unix systems, and Windows.

This package is no longer maintained for Python 3.6. The latest version available for Python 3.6 is 0.1.129. It should work on Python 3.7, although it is not tested against it. Please use Python 3.8: Python 3.9 and above fail to create the tar.gz.


TransparentPath

A class that allows one to use a path in a local file system or a Google Cloud Storage (GCS) file system in the same way one would use a pathlib.Path object. One can use many different GCP projects at once.

Requirements

You will need GCP credentials: either a .json credential file, whose path you can set in the environment variable GOOGLE_APPLICATION_CREDENTIALS, or run directly on a Google Cloud instance (VM, pod, etc.).

Installation

You can install this package with pip:

pip install transparentpath

Optional packages

The vanilla version allows you to declare paths and work with them, including with the builtin open method. Optionally, you can also install support for several other packages like pandas, dask, etc. The currently available optional packages can be installed through the following commands:

pip install transparentpath[pandas]
pip install transparentpath[parquet]
pip install transparentpath[hdf5]
pip install transparentpath[json]
pip install transparentpath[excel]
pip install transparentpath[dask]

You can also install all of them at once:

pip install transparentpath[all]

Usage

Create a path that points to GCS, and one that does not:

from transparentpath import Path
# Or : from transparentpath import TransparentPath
p = Path("gs://mybucket/some_path", token="some/cred/file.json")
p2 = p / "foo"  # Will point to gs://mybucket/some_path/foo
p3 = Path("bar")  # Will point to local path "bar"

Set all paths to point to GCS by default:

from transparentpath import Path
Path.set_global_fs("gcs", token="some/cred/file.json")
p = Path("mybucket") / "some_path" # Will point to gs://mybucket/some_path
p2 = p / "foo"  # Will point to gs://mybucket/some_path/foo
p3 = Path("bar", fs="local")  # Will point to local path "bar"
p4 = Path("other_bucket")  # Will point to gs://other_bucket (assuming other_bucket is a bucket on GCS)
p5 = Path("not_a_bucket")  # Will point to local path "not_a_bucket" (assuming it is indeed, not a bucket on GCS)

Set all paths to point to several GCS projects by default:

from transparentpath import Path
Path.set_global_fs("gcs", token="some/cred/file.json")
Path.set_global_fs("gcs", token="some/other/cred/file.json")
p = Path("mybucket") / "some_path" # Will point to gs://mybucket/some_path
p2 = p / "foo"  # Will point to gs://mybucket/some_path/foo
p3 = Path("bar", fs="local")  # Will point to local path "bar"
p4 = Path("other_bucket")  # Will point to gs://other_bucket (assuming other_bucket is a bucket on GCS)
p5 = Path("not_a_bucket")  # Will point to local path "not_a_bucket" (assuming it is indeed, not a bucket on GCS)

Here, mybucket and other_bucket can be on two different projects, as long as at least one of the credential files can access them.

Set all paths to point to GCS by default, and specify a default bucket:

from transparentpath import Path
Path.set_global_fs("gcs", bucket="mybucket", token="some/cred/file.json")
p = Path("some_path")  # Will point to gs://mybucket/some_path/
p2 = p / "foo"  # Will point to gs://mybucket/some_path/foo
p3 = Path("bar", fs="local")  # Will point to local path "bar"
p4 = Path("other_bucket")  # Will point to gs://mybucket/other_bucket
p5 = Path("not_a_bucket")  # Will point to gs://mybucket/not_a_bucket

The last option is useful if your code should be able to run with paths that are sometimes remote and sometimes local. For that, you can use the class attribute nas_dir: when a path is created, if it starts with nas_dir's path, that prefix is replaced by the bucket name. This is handy if, for instance, you keep a local backup of a bucket at, say, /my/local/backup. Then you can do:

from transparentpath import Path
Path.nas_dir = "/my/local/backup"
Path.set_global_fs("gcs", bucket="mybucket", token="some/cred/file.json")
p = Path("some_path")  # Will point to gs://mybucket/some_path/
p3 = Path("/my/local/backup") / "some_path"  # Will ALSO point to gs://mybucket/some_path/
Without the call to set_global_fs, the same paths remain local:

from transparentpath import Path
Path.nas_dir = "/my/local/backup"
# Path.set_global_fs("gcs", bucket="mybucket", token="some/cred/file.json")
p = Path("some_path")  # Will point to /my/local/backup/some_path/
p3 = Path("/my/local/backup") / "some_path"  # Will ALSO point to /my/local/backup/some_path/

In all the previous examples, the token argument can be omitted if the environment variable GOOGLE_APPLICATION_CREDENTIALS is set and points to a .json credential file, or if your code runs on a GCP machine (VM, cluster, etc.) with access to GCS.
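For example, one way to set that variable from Python itself, before creating any GCS paths (the credential path below is a placeholder for your own key file):

```python
import os

# Placeholder path: point this at your actual service-account key file
os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = "/path/to/cred.json"

# From here on, GCS paths can be created without an explicit token, e.g.:
# Path.set_global_fs("gcs", bucket="mybucket")
```

Setting the variable in your shell profile or deployment environment works just as well; the code only needs it set before the first GCS access.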

No matter whether you are using GCS or your local file system, here is a sample of what TransparentPath can do:

from transparentpath import Path
# Path.set_global_fs("gcs", bucket="bucket_name", project="project_name")
# The following lines will also work with the previous line uncommented 

# Reading a csv into a pandas' DataFrame and saving it as a parquet file
mypath = Path("foo") / "bar.csv"
df = mypath.read(index_col=0, parse_dates=True)
otherpath = mypath.with_suffix(".parquet")
otherpath.write(df)

# Reading and writing a HDF5 file works on GCS and on local:
import numpy as np
mypath = Path("foo") / "bar.hdf5"  # can be .h5 too
with mypath.read() as ifile:
    arr = np.array(ifile["store1"])

# Doing '..' from 'foo/bar.hdf5' will return 'foo'
# Then doing 'foo' + 'babar.hdf5' will return 'foo/babar.hdf5' ('+' and '/' are synonyms)
mypath.cd("..")  # Does not return a path but modifies inplace
with (mypath  + "babar.hdf5").write(None) as ofile:
    # Note here that we must explicitly give 'None' to the 'write' method in order for it
    # to return the open HDF5 file. We could also give a dict of {arr: "store1"} to directly
    # write the file.
    ofile["store1"] = arr


# Reading a text file. Can also use 'w', 'a', etc... also works with binaries.
mypath = Path("foo") / "bar.txt"
with open(mypath, "r") as ifile:
    lines = ifile.readlines()

# open is overriden to understand gs://
with open("gs://bucket/file.txt", "r") as ifile:
    lines = ifile.readlines()

mypath.is_file()
mypath.is_dir()
files = mypath.parent.glob("*.csv")  # Returns an Iterator[TransparentPath], which can be cast to a list

As you can see from the previous example, any TransparentPath method that returns a path returns a TransparentPath.

Dask

TransparentPath supports writing and reading Dask dataframes from and to csv, excel, parquet and HDF5 files, both locally and remotely. You need dask-dataframe and dask-distributed installed, which will be the case if you ran pip install transparentpath[dask]. Writing Dask dataframes does not require any additional arguments, since the type is checked before calling the appropriate writing method. Reading, however, requires you to pass the use_dask argument to the read() method. If the file to read is HDF5, you will also need to specify set_names, matching the key argument of Dask's read_hdf() method.

Note that when reading a remote HDF5 file, the file is downloaded to your local tmp directory, then read. Without Dask, the file is deleted after being read. But since Dask uses delayed processes, the deletion might occur before the file is actually read, so the file is kept. It is up to you to empty your /tmp directory if your system does not do it automatically.

Behavior

All instances of TransparentPath are absolute, even if created with relative paths.
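The local behavior mirrors calling resolve() on a pathlib.Path; this sketch shows the pathlib equivalent of what a TransparentPath does on creation (it uses pathlib, not TransparentPath itself):

```python
from pathlib import Path as PathlibPath

rel = PathlibPath("bar")       # relative, exactly as typed
assert not rel.is_absolute()

# TransparentPath resolves immediately; pathlib only does so on demand:
resolved = rel.resolve()
assert resolved.is_absolute()  # now anchored at the current working directory
```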

TransparentPaths are seen as instances of str:

from transparentpath import Path
path = Path()
isinstance(path, str)  # returns True

This is required to allow

from transparentpath import Path
path = Path()
with open(path, "w/r/a/b...") as ifile:
    ...

to work. If you want to check whether path is actually a TransparentPath and nothing else, use

from transparentpath import Path
path = Path()
assert type(path) == Path
assert issubclass(type(path), Path)

instead.
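To see why inheriting from str makes the builtin open work unchanged, here is a minimal stand-in class (a simplified illustration, not TransparentPath's actual implementation):

```python
import os
import tempfile

class StrPath(str):
    """Toy path type that *is* a str, like TransparentPath."""
    def __truediv__(self, other):
        # '/' builds a child path, mirroring pathlib semantics
        return StrPath(os.path.join(self, other))

base = StrPath(tempfile.mkdtemp())
p = base / "hello.txt"

# The builtin open accepts p directly because isinstance(p, str) is True
with open(p, "w") as ofile:
    ofile.write("hello")
with open(p) as ifile:
    assert ifile.read() == "hello"

assert isinstance(p, str)   # behaves as a str everywhere a str is expected
assert type(p) is StrPath   # but its concrete type is StrPath
```

This is why the type(path) check above is needed when you want to distinguish a genuine TransparentPath from an ordinary string.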

Any method or attribute valid for a pathlib.Path object can be used on a TransparentPath.
