API Reference

Helper Functions

Helper functions for Cloud Storage.

cloudstorage.helpers.file_checksum(filename: str, hash_type: str = 'md5', block_size: int = 4096) → str[source]

Return the checksum for a file.

from cloudstorage.helpers import file_checksum

picture_path = '/path/picture.png'
file_checksum(picture_path, hash_type='sha256')
# '03ef90ba683795018e541ddfb0ae3e958a359ee70dd4fccc7e747ee29b5df2f8'

Source: get-md5-hash-of-big-files-in-python

Parameters:
  • filename (str) – File path.
  • hash_type (str) – Hash algorithm function name.
  • block_size (int) – (optional) Chunk size.
Returns: Hex digest of the file.
Return type: str
Raises: RuntimeError – If the hash algorithm is not found in hashlib.
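
A hedged sketch of the error path, reusing the hypothetical picture path from the example above; 'not-a-hash' stands in for any algorithm name hashlib does not provide:

from cloudstorage.helpers import file_checksum

picture_path = '/path/picture.png'

try:
    # A hash name hashlib does not provide raises RuntimeError; block_size
    # only controls how much of the file is read per iteration.
    file_checksum(picture_path, hash_type='not-a-hash', block_size=65536)
except RuntimeError as err:
    print(err)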

cloudstorage.helpers.file_content_type(filename: typing.Union[str, typing.IO[_io.BytesIO], _io.BytesIO, _io.FileIO, _io.TextIOWrapper]) → typing.Union[str, NoneType][source]

Guess the content type for a file path or file-like object.

Parameters: filename (str or file) – File path or file-like object.
Returns: Content type.
Return type: str
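
A short usage sketch, assuming the same hypothetical PNG path; a '.png' name is expected to map to 'image/png':

from cloudstorage.helpers import file_content_type

file_content_type('/path/picture.png')
# 'image/png'

# File-like objects are accepted as well.
with open('/path/picture.png', 'rb') as picture_file:
    file_content_type(picture_file)
    # 'image/png'
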
cloudstorage.helpers.read_in_chunks(file_object: _io.FileIO, block_size: int = 4096) → typing.Iterable[bytes][source]

Return a generator which yields data in chunks.

Source: read-file-in-chunks-ram-usage-read-strings-from-binary-file

Parameters:
  • file_object (file object) – File object to read in chunks.
  • block_size (int) – (optional) Chunk size.
Yields: The next chunk in the file object.
Yield type: bytes
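
A minimal sketch, assuming the same hypothetical path; each iteration yields at most block_size bytes, so the whole file never has to fit in memory:

from cloudstorage.helpers import read_in_chunks

with open('/path/picture.png', 'rb') as picture_file:
    for chunk in read_in_chunks(picture_file, block_size=1024):
        # Each chunk is a bytes object of up to 1024 bytes.
        print(len(chunk))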

cloudstorage.helpers.validate_file_or_path(filename: typing.Union[str, typing.IO[_io.BytesIO], _io.BytesIO, _io.FileIO, _io.TextIOWrapper]) → typing.Union[str, NoneType][source]

Return the filename from a file path or file-like object.

Source: rackspace/pyrax/object_storage.py

Parameters: filename (str or file) – File path or file-like object.
Returns: Filename.
Return type: str
Raises: FileNotFoundError – If the file path is invalid.
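
A brief sketch, assuming the same hypothetical path and that the returned filename is the path's base name:

from cloudstorage.helpers import validate_file_or_path

validate_file_or_path('/path/picture.png')
# 'picture.png'

try:
    # An invalid path raises FileNotFoundError, as documented above.
    validate_file_or_path('/path/does-not-exist.png')
except FileNotFoundError as err:
    print(err)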

Utility Functions

Utility functions for Cloud Storage.

cloudstorage.utils.rgetattr(obj, attr, default=<object object>)[source]

Get a nested named attribute from an object.

Example:

from cloudstorage.utils import rgetattr

b = type('B', (), {'c': True})()
a = type('A', (), {'b': b})()
rgetattr(a, 'b.c')
# True

Source: getattr-and-setattr-on-nested-objects

Parameters:
  • obj (object) – Object.
  • attr (str) – Dot notation attribute name.
  • default (object) – (optional) Sentinel value, defaults to object().
Returns: Attribute value.
Return type: object
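
A hedged sketch of the default argument, assuming it behaves like getattr's default and is returned when the attribute chain cannot be resolved:

from cloudstorage.utils import rgetattr

a = type('A', (), {'b': type('B', (), {'c': True})()})()

# Assumption: a missing attribute falls back to the supplied default.
rgetattr(a, 'b.missing', 'fallback')
# 'fallback'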

cloudstorage.utils.rsetattr(obj, attr, val)[source]

Set a nested named attribute on an object to the given value.

Example:

from cloudstorage.utils import rsetattr

b = type('B', (), {'c': True})()
a = type('A', (), {'b': b})()
rsetattr(a, 'b.c', False)
a.b.c
# False

Source: getattr-and-setattr-on-nested-objects

Parameters:
  • obj (object) – Object.
  • attr (str) – Dot notation attribute name.
  • val (object) – Value to set.
Returns: None
Return type: NoneType

Exceptions

Exceptions for Cloud Storage errors.

exception cloudstorage.exceptions.CloudStorageError(message: str) → None[source]

Base class for exceptions.

exception cloudstorage.exceptions.NotFoundError(message: str) → None[source]

Raised when a container or blob does not exist.

exception cloudstorage.exceptions.IsNotEmptyError(message: str) → None[source]

Raised when the container is not empty.

exception cloudstorage.exceptions.SignatureExpiredError → None[source]

Raised when the signature timestamp is older than the required maximum age.
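
A minimal sketch, assuming the specific exceptions derive from CloudStorageError (the documented base class), so one handler can cover them all:

from cloudstorage.exceptions import CloudStorageError, NotFoundError

try:
    # Illustrative only: raise the error a missing container would produce.
    raise NotFoundError('Container does not exist.')
except CloudStorageError as err:
    print(err)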

Logging

By default, Cloud Storage attaches a logging.NullHandler to its logger, so log records are discarded. To attach your own log handler:

import logging

logger = logging.getLogger('cloudstorage')
logger.setLevel(logging.DEBUG)

ch = logging.StreamHandler()
ch.setLevel(logging.DEBUG)

formatter = logging.Formatter(
    '%(asctime)s - %(name)s.%(funcName)s - %(levelname)s - %(message)s')

ch.setFormatter(formatter)
logger.addHandler(ch)