
# S3-Compatible Access

Oceanum Storage provides an S3-compatible API, allowing you to use standard AWS S3 tools and libraries to interact with your storage.

| Setting | Value |
| --- | --- |
| Endpoint URL | `https://storage.oceanum.io` |
| Region | `auto` |

You’ll need your Oceanum.io access credentials to authenticate:

  1. Log in to Oceanum.io Platform
  2. Navigate to your account settings
  3. Generate or retrieve your access key and secret key

Install the AWS CLI if you haven’t already:

```sh
pip install awscli
```

Configure a profile for Oceanum Storage:

```sh
aws configure --profile oceanum
```

Enter your credentials when prompted:

  • AWS Access Key ID: Your Oceanum access key
  • AWS Secret Access Key: Your Oceanum secret key
  • Default region name: auto
  • Default output format: json
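As an alternative to a named profile, the AWS CLI and SDKs also read credentials from standard environment variables. The sketch below assumes AWS CLI v2.13 or later, which added support for `AWS_ENDPOINT_URL`:

```shell
# Alternative to `aws configure`: export credentials as environment variables.
# AWS_ENDPOINT_URL (AWS CLI v2.13+) removes the need for --endpoint-url flags.
export AWS_ACCESS_KEY_ID=YOUR_ACCESS_KEY
export AWS_SECRET_ACCESS_KEY=YOUR_SECRET_KEY
export AWS_ENDPOINT_URL=https://storage.oceanum.io

# aws s3 ls    # now targets Oceanum Storage without --endpoint-url
```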

List buckets:

```sh
aws s3 --endpoint-url=https://storage.oceanum.io ls
```

List contents of a bucket:

```sh
aws s3 --endpoint-url=https://storage.oceanum.io ls s3://my-org-bucket/
```

Upload a file:

```sh
aws s3 --endpoint-url=https://storage.oceanum.io cp myfile.nc s3://my-org-bucket/data/
```

Download a file:

```sh
aws s3 --endpoint-url=https://storage.oceanum.io cp s3://my-org-bucket/data/myfile.nc ./
```

Sync a directory:

```sh
aws s3 --endpoint-url=https://storage.oceanum.io sync ./local-datadir s3://my-org-bucket/sensors
```

:::tip Example
Here’s a real-world example syncing sensor data to an organization bucket:

```sh
aws s3 --endpoint-url=https://storage.oceanum.io sync ./root_datadir s3://oceanum-eda-org-port-taranaki/sensors
```
:::

From Python, point a boto3 client at the Oceanum endpoint:

```python
import boto3

# Create a client against the Oceanum Storage endpoint
s3 = boto3.client(
    's3',
    endpoint_url='https://storage.oceanum.io',
    aws_access_key_id='YOUR_ACCESS_KEY',
    aws_secret_access_key='YOUR_SECRET_KEY',
)

# List buckets
response = s3.list_buckets()
for bucket in response['Buckets']:
    print(bucket['Name'])

# Upload a file
s3.upload_file('local_file.nc', 'my-bucket', 'data/remote_file.nc')

# Download a file
s3.download_file('my-bucket', 'data/remote_file.nc', 'downloaded_file.nc')

# List objects in a bucket
response = s3.list_objects_v2(Bucket='my-bucket', Prefix='data/')
for obj in response.get('Contents', []):
    print(obj['Key'])
```
You can also use s3cmd:

```sh
pip install s3cmd
```

Create `~/.s3cfg`:

```ini
[default]
access_key = YOUR_ACCESS_KEY
secret_key = YOUR_SECRET_KEY
host_base = storage.oceanum.io
host_bucket = %(bucket)s.storage.oceanum.io
use_https = True
```
```sh
# List buckets
s3cmd ls

# Upload file
s3cmd put myfile.nc s3://my-bucket/data/

# Download file
s3cmd get s3://my-bucket/data/myfile.nc
```

Oceanum Storage supports the following S3 operations:

| Operation | Supported |
| --- | --- |
| ListBuckets | Yes |
| CreateBucket | Yes |
| DeleteBucket | Yes |
| ListObjects | Yes |
| GetObject | Yes |
| PutObject | Yes |
| DeleteObject | Yes |
| CopyObject | Yes |
| HeadObject | Yes |
| Multipart Upload | Yes |
| Presigned URLs | Yes |

For large files (>100MB), multipart uploads are recommended for reliability. Most S3 tools handle this automatically, but you can configure the threshold:

```sh
# AWS CLI - set multipart threshold to 100MB
aws configure set s3.multipart_threshold 100MB --profile oceanum
```

Generate temporary URLs to share files without exposing credentials:

```python
url = s3.generate_presigned_url(
    'get_object',
    Params={'Bucket': 'my-bucket', 'Key': 'data/file.nc'},
    ExpiresIn=3600,  # URL valid for 1 hour
)
print(url)
```