Oracle Transportation Management Source

Connect Oracle Transportation Management (OTM) as a source to sync shipments, orders, releases, and logistics data.

Prerequisites

Before you begin, ensure you have:

  • An active Oracle OTM instance (Cloud or On-Premises)
  • Integration user credentials with API access
  • For Basic Auth: OTM username and password
  • For OAuth: IDCS application with Client Credentials grant
  • API permissions for Export APIs and Data Integration APIs
  • Network connectivity to your OTM server

Configuration

Step 1: Server Connection

Server URL*

OTM server base URL
Example: https://your-instance.otmgtm.region.ocs.oraclecloud.com
Format: https://[subdomain].otmgtm.[region].ocs.oraclecloud.com

Finding Your Server URL

Your OTM server URL is the base URL you use to access the OTM application. It typically follows the pattern:

  • Cloud: https://[company]-[env].otmgtm.[region].ocs.oraclecloud.com
  • Example: https://acme-prod.otmgtm.eu-frankfurt-1.ocs.oraclecloud.com

Do not include paths like /GC3/ or /logisticsRestApi/ - just the base URL.
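
The base-URL rule above can be enforced with a small helper. This is an illustrative sketch (the function name is hypothetical, not part of any Oracle or Supaflow API) that strips accidentally copied paths such as /GC3/ down to the base URL:

```python
from urllib.parse import urlparse

# Hypothetical helper: normalize a pasted OTM URL down to the base URL,
# dropping path suffixes like /GC3/ or /logisticsRestApi/ that users
# often copy from the browser by accident.
def normalize_server_url(url: str) -> str:
    parsed = urlparse(url.strip())
    if parsed.scheme != "https" or not parsed.netloc:
        raise ValueError(f"Expected an https URL, got: {url!r}")
    # Keep only scheme and host; discard any path, query, or fragment.
    return f"https://{parsed.netloc}"

print(normalize_server_url(
    "https://acme-prod.otmgtm.eu-frankfurt-1.ocs.oraclecloud.com/GC3/"
))
```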


Step 2: Authentication

Authentication Type*

Choose an authentication method: Basic Auth uses a username and password, while OAuth uses IDCS token-based authentication.
Options:

  • basic (Default) - Direct username/password authentication
  • oauth - OAuth 2.0 Client Credentials via IDCS

Default: basic


Option A: Basic Authentication (Default)

Username*

OTM integration user username
Example: DOMAIN.USERNAME
Format: Typically includes domain prefix

Password*

OTM integration user password
Stored encrypted

Basic Auth Setup
  1. Create Integration User in OTM:

    • Log in to OTM as administrator
    • Go to Administration → User Management
    • Create a new user with integration permissions
    • Username format is typically DOMAIN.USERNAME (e.g., DEFAULT.SUPAFLOW_API)
  2. Grant Required Permissions:

    • Data Export: Access to Export Request APIs
    • Data Integration: Read access to logistics objects (Shipment, Order, Release, etc.)
    • Object Permissions: Read access to tables you want to sync
  3. Test Credentials:

    • Try logging into OTM web interface with these credentials
    • Verify user can access the objects you want to sync
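
Basic Auth is standard HTTP Basic authentication, so the Authorization header is simply the base64 encoding of `username:password`. A minimal sketch (the helper name and placeholder credentials are illustrative only):

```python
import base64

# Build a standard HTTP Basic Authorization header from OTM
# integration-user credentials.
def basic_auth_header(username: str, password: str) -> dict:
    token = base64.b64encode(f"{username}:{password}".encode()).decode()
    return {"Authorization": f"Basic {token}"}

# Example with a placeholder integration user (DOMAIN.USERNAME format):
headers = basic_auth_header("DEFAULT.SUPAFLOW_API", "s3cret")
print(headers["Authorization"])
```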

Option B: OAuth 2.0 Authentication

Client ID*

OAuth2 Client ID from IDCS application
Copy from IDCS Confidential Application

Client Secret*

OAuth2 Client Secret from IDCS application
Stored encrypted

Token URL*

IDCS token endpoint URL
Example: https://idcs-xxx.identity.oraclecloud.com/oauth2/v1/token
Format: https://[idcs-tenant].identity.oraclecloud.com/oauth2/v1/token

OAuth Scope

OAuth scope for OTM API access
Leave empty for default scope
Example: urn:opc:resource:consumer::all

OAuth Setup via IDCS

Oracle OTM uses Oracle Identity Cloud Service (IDCS) for OAuth authentication.

1. Create IDCS Application:

  • Log in to Oracle Identity Cloud Service console
  • Navigate to Applications → Add
  • Select Confidential Application
  • Enter application name (e.g., "Supaflow OTM Integration")
  • Click Next

2. Configure Client Credentials:

  • Select Configure this application as a client now
  • Under Allowed Grant Types, select Client Credentials
  • Under Grant the client access to Identity Cloud Service Admin APIs, select:
    • Authenticator Client (if available)
    • Or add the OTM resource scope manually
  • Click Next through remaining screens
  • Click Finish

3. Copy Credentials:

  • After creating the application, copy the Client ID and Client Secret
  • Find your IDCS Tenant name from the IDCS URL (e.g., idcs-abc123)
  • Construct Token URL: https://[idcs-tenant].identity.oraclecloud.com/oauth2/v1/token

4. Add OTM Resource:

  • If OTM is registered as a resource server in IDCS, add the appropriate scope
  • Typical format: urn:opc:resource:consumer::all or OTM-specific scope
  • Consult your Oracle administrator for the correct scope

OAuth Token Management

OAuth tokens are automatically managed by Supaflow:

  • Access Token: Automatically refreshed when expired
  • Token Expiry: Tracked internally (typically 1 hour)
  • No manual intervention needed after initial setup
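
The refresh behavior described above can be sketched as a small token cache: keep the access token until shortly before its expiry, then fetch a new one. This is an illustrative model only (class and function names are hypothetical, and the actual IDCS HTTP request is stubbed out):

```python
import time

# Cache an access token and refresh it slightly before expiry.
class TokenCache:
    def __init__(self, fetch_token, skew_seconds=60):
        self._fetch = fetch_token       # returns (access_token, expires_in)
        self._skew = skew_seconds       # refresh this many seconds early
        self._token = None
        self._expires_at = 0.0

    def get(self, now=None):
        now = time.time() if now is None else now
        if self._token is None or now >= self._expires_at - self._skew:
            self._token, expires_in = self._fetch()
            self._expires_at = now + expires_in
        return self._token

# Stubbed fetcher standing in for the IDCS client-credentials request:
calls = []
def fake_fetch():
    calls.append(1)
    return (f"token-{len(calls)}", 3600)  # IDCS tokens typically last 1 hour

cache = TokenCache(fake_fetch)
print(cache.get(now=0))      # first call fetches a token
print(cache.get(now=1000))   # still valid, returned from cache
print(cache.get(now=3590))   # inside the skew window, refreshed
```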

Step 3: Export Settings

Export Mode*

Sync mode returns data directly (recommended for most use cases). Async mode exports to Object Storage for very large datasets (>100K rows).
Options:

  • sync (Default) - Direct synchronous data export via REST API
  • async - Asynchronous export to Oracle Object Storage

Default: sync

Export Modes

Sync Mode (Recommended):

  • Data returned directly in API response
  • Suitable for datasets up to ~100,000 rows
  • Faster for small to medium datasets
  • No additional Oracle Cloud resources needed

Async Mode:

  • Data exported to Oracle Object Storage
  • Required for very large datasets (>100K rows)
  • Supaflow downloads from Object Storage
  • Requires Object Storage bucket and Pre-Authenticated Request (PAR) URL

Option: Async Mode Configuration

Target System URL

Object Storage bucket Pre-Authenticated Request (PAR) URL for async exports. Required if Export Mode = async.
Example: https://objectstorage.region.oraclecloud.com/.../bucket.../o/
Format: Must end with /o/ for object uploads

Creating Object Storage PAR URL

If using async export mode, you need to provide an Oracle Object Storage PAR URL:

1. Create Object Storage Bucket:

  • Log in to Oracle Cloud Infrastructure (OCI) console
  • Navigate to Storage → Object Storage → Buckets
  • Click Create Bucket
  • Enter bucket name (e.g., supaflow-otm-exports)
  • Select Standard storage tier
  • Click Create

2. Create Pre-Authenticated Request (PAR):

  • Open the bucket you created
  • Click Pre-Authenticated Requests in the left menu
  • Click Create Pre-Authenticated Request
  • Configure:
    • Name: "Supaflow Write Access"
    • Access Type: Permit object writes or Read and write
    • Expiration: Set to future date (e.g., 1 year from now)
    • Prefix: Leave empty or use prefix like supaflow/
  • Click Create
  • IMPORTANT: Copy the PAR URL immediately - it's only shown once
  • PAR URL format: https://objectstorage.region.oraclecloud.com/p/{unique-token}/n/{namespace}/b/{bucket-name}/o/

3. Enter PAR URL in Supaflow:

  • Paste the complete PAR URL into the Target System URL field
  • Ensure URL ends with /o/
  • Save and test connection
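
The PAR URL rules above (Object Storage host, /p/…/n/…/b/…/o/ structure, trailing /o/) can be checked before saving. A hedged sketch, with an illustrative function name and a pattern based on the format shown above:

```python
import re

# Validate that a PAR URL points at Oracle Object Storage and ends
# with /o/ so objects can be uploaded.
PAR_PATTERN = re.compile(
    r"^https://objectstorage\.[\w.-]+\.oraclecloud\.com"
    r"/p/[^/]+/n/[^/]+/b/[^/]+/o/$"
)

def is_valid_par_url(url: str) -> bool:
    return bool(PAR_PATTERN.match(url.strip()))

print(is_valid_par_url(
    "https://objectstorage.eu-frankfurt-1.oraclecloud.com"
    "/p/abc123/n/mytenancy/b/supaflow-otm-exports/o/"
))  # well-formed PAR URL
print(is_valid_par_url(
    "https://objectstorage.eu-frankfurt-1.oraclecloud.com/p/abc/n/t/b/bkt"
))  # rejected: missing the trailing /o/
```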

Step 4: Advanced Settings (Optional)

Schema Refresh Interval

Interval in minutes for schema metadata refresh
Default: 60
Range: -1 to 10080

Options:

  • 0 = Refresh schema before every pipeline execution (recommended for frequent schema changes)
  • 60 = Refresh every hour (recommended for stable schemas)
  • -1 = Disable automatic schema refresh (use for very stable schemas)

Incremental Lookback

How many seconds to look back when resuming incremental syncs to capture late-arriving updates
Default: 0
Range: 0 to 86400 (24 hours)
Recommended: 300-600 seconds in production

When to Adjust Lookback Time
  • Set to 300-600 (5-10 minutes) if you have late-arriving logistics updates
  • Use 0 if data arrives in real-time with accurate timestamps
  • Higher values ensure completeness but may cause duplicate processing
  • OTM systems often have delayed updates due to integration workflows

Step 5: Test & Save

After configuring your authentication and settings, click Test & Save to verify your connection and save the source.

Available Objects

Supaflow syncs the following OTM objects via Export APIs:

Core Tables

Object | Primary Key | Description
SHIPMENT | SHIPMENT_GID | Shipment records containing logistics and transportation data
LOCATION | LOCATION_GID | Location records containing facility and address information
ITEM | ITEM_GID | Item records containing product and material master data
CONTACT | CONTACT_GID | Contact records containing person and organization contact information

Order Tables

Object | Primary Key | Description
ORDER_RELEASE | ORDER_RELEASE_GID | Order release records representing customer orders for fulfillment
ORDER_RELEASE_STATUS | ORDER_RELEASE_GID | Order release status history and tracking information

Shipment Child Tables

Object | Primary Key | Description
SHIPMENT_COST | SHIPMENT_COST_GID | Shipment cost records for freight charges and accessorials
SHIPMENT_COST_QUAL | SHIPMENT_COST_GID | Shipment cost qualifiers providing additional cost attributes
SHIPMENT_COST_REF | SHIPMENT_COST_GID | Shipment cost reference numbers and external identifiers
SHIPMENT_COST_REMARK | SHIPMENT_COST_GID | Shipment cost remarks and notes
SHIPMENT_STATUS | SHIPMENT_GID | Shipment status history and milestone tracking
SHIPMENT_STOP | SHIPMENT_GID | Shipment stop locations for pickup and delivery points
SHIP_UNIT | SHIP_UNIT_GID | Ship unit records representing handling units and containers

Reference Tables

Object | Primary Key | Description
ADJUSTMENT_REASON | ADJUSTMENT_REASON_GID | Adjustment reason codes for inventory and cost adjustments
ALLOCATION | ALLOCATION_GID | Allocation records for inventory and capacity reservations
AUDIT_TRAIL | AUDIT_TRAIL_GID | Audit trail records tracking data changes and user actions

Object Discovery

All objects use UPDATE_DATE as the cursor field for incremental sync. Objects are discovered dynamically via the Export API - if a table has no data, it will not appear in schema discovery but will become available once data exists.

Incremental Sync

The OTM connector supports time-based incremental sync using cursor fields:

  • Cursor field: UPDATE_DATE or LAST_UPDATE_DATE (varies by object)
  • Sync mode: Time-range based with lookback support
  • Strategy: Cutoff time approach to handle late-arriving data

How it works:

  1. Each sync captures a cutoff time (current time when sync starts)
  2. Fetches all records updated between last cursor and cutoff time
  3. Next sync starts from the previous cutoff time
  4. Lookback time ensures late updates are captured
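
The cutoff strategy above can be sketched as a pure function that computes the next sync window. Names are illustrative, not the connector's internal API:

```python
from datetime import datetime, timedelta

# Compute the (start, end) time range for the next incremental sync.
# The window opens `lookback_seconds` before the previous cutoff so
# late-arriving updates are re-fetched; `now` becomes the next cutoff.
def next_sync_window(last_cutoff: datetime, now: datetime,
                     lookback_seconds: int = 300):
    start = last_cutoff - timedelta(seconds=lookback_seconds)
    end = now  # stored as last_cutoff for the following sync
    return start, end

last = datetime(2024, 5, 1, 12, 0, 0)
now = datetime(2024, 5, 1, 13, 0, 0)
start, end = next_sync_window(last, now, lookback_seconds=300)
print(start)  # window opens 5 minutes before the previous cutoff
print(end)
```

Records updated between `start` and `end` are fetched; any record re-fetched inside the lookback window is simply upserted again, which is why higher lookback values trade duplicate processing for completeness.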

Troubleshooting

Common issues and their solutions:

Basic Auth connection failed

Problem:

  • "Authentication failed" error
  • "401 Unauthorized" response
  • Connection test fails

Solutions:

  1. Verify credentials:
    • Check username format includes domain (e.g., DEFAULT.USERNAME)
    • Ensure password is correct (no extra spaces)
    • Try logging into OTM web interface with these credentials
  2. Check user permissions:
    • User must have Data Integration role
    • User must have Export permissions
    • Verify user is not locked or disabled
  3. Check server URL:
    • Ensure URL is correct and accessible
    • URL should be base URL only (no /GC3/ suffix)
    • Try accessing {serverUrl}/GC3/ in a browser
  4. Network connectivity:
    • Verify OTM server is accessible from Supaflow
    • Check if corporate firewall blocks the connection
    • Contact Oracle support if server appears down

OAuth authentication failed

Problem:

  • "Invalid OAuth credentials" error
  • "Failed to obtain access token" error
  • Token request returns 401/403

Solutions:

  1. Verify IDCS credentials:
    • Client ID and Client Secret are correct
    • Credentials copied from IDCS application without extra spaces
    • IDCS application is Activated (not Deactivated)
  2. Check Token URL:
    • Token URL format: https://[tenant].identity.oraclecloud.com/oauth2/v1/token
    • Replace [tenant] with your actual IDCS tenant name
    • No typos in the URL
  3. Verify grant type:
    • IDCS application must have Client Credentials grant enabled
    • Check under Configuration → Client Configuration
  4. Check OAuth scope:
    • If scope is required, ensure it matches OTM resource scope
    • Try leaving scope blank (uses default)
    • Consult Oracle administrator for correct scope
  5. IDCS application status:
    • Application must be Activated
    • Check application is not expired
    • Verify application has not been revoked

No objects appearing

Problem:

  • Successfully connected but no OTM objects show up
  • Schema is empty
  • "No objects found" warning

Solutions:

  1. Check user permissions:
    • User must have Read access to OTM objects
    • Verify Data Export permissions in OTM
    • Check Data Integration role is assigned
  2. Verify Export API access:
    • User can access /logisticsRestApi/data-int/v1/exportRequests/
    • Try making a manual API call with user credentials
    • Check Oracle documentation for required permissions
  3. Check OTM configuration:
    • Ensure OTM instance has objects configured
    • Verify objects are not hidden by security policy
    • Contact Oracle administrator to verify instance setup
  4. Refresh schema:
    • Set Schema Refresh Interval to 0
    • Save and test connection again
    • Wait a few minutes for schema discovery to complete

Async export not working

Problem:

  • Async mode fails with "Target System URL" error
  • Data not appearing in Object Storage
  • Connection test succeeds but exports fail

Solutions:

  1. Verify PAR URL:
    • PAR URL must end with /o/
    • URL format: https://objectstorage.{region}.oraclecloud.com/p/{token}/n/{namespace}/b/{bucket}/o/
    • No extra spaces or characters in URL
  2. Check PAR expiration:
    • Pre-Authenticated Request may have expired
    • Go to OCI → Object Storage → Bucket → Pre-Authenticated Requests
    • Check expiration date and create new PAR if expired
  3. Check PAR permissions:
    • PAR must have Write or Read and Write access
    • Read-only PARs will not work for exports
    • Recreate PAR with correct permissions
  4. Verify Object Storage bucket:
    • Bucket exists and is in the correct region
    • Bucket is not archived or deleted
    • Check bucket quota is not exceeded
  5. Network connectivity:
    • OTM server can reach Oracle Object Storage
    • No firewall blocking outbound HTTPS to Object Storage
    • Verify region-specific Object Storage endpoints are accessible

Data export timeout

Problem:

  • Export times out during sync
  • "Request timeout" error
  • Slow response from OTM

Solutions:

  1. Use async mode:
    • Switch from sync to async export mode
    • Async mode handles large datasets better
    • Set up Object Storage PAR URL
  2. Check OTM server load:
    • OTM may be experiencing high load
    • Try sync during off-peak hours (evening/weekend)
    • Contact Oracle support if server is consistently slow
  3. Reduce data volume:
    • Select fewer objects to sync
    • Use incremental sync instead of full sync
    • Add filters to reduce row count
  4. Check network latency:
    • Ping OTM server to check connectivity
    • High latency may cause timeouts
    • Consider using async mode to avoid timeout issues

Incremental sync not capturing updates

Problem:

  • Incremental sync misses recent updates
  • Full sync works but incremental doesn't
  • Data appears stale

Solutions:

  1. Check cursor field:
    • Verify object has UPDATE_DATE or LAST_UPDATE_DATE field
    • Cursor field must be populated for all records
    • Check sample records in OTM to verify timestamps
  2. Increase lookback time:
    • Set Incremental Lookback to 300-600 seconds
    • Helps catch late-arriving updates
    • OTM integration workflows may delay updates
  3. Verify server time offset:
    • OTM server clock may differ from Supaflow
    • Connector automatically calculates offset
    • Check sync logs for "server time offset" message
  4. Check for clock skew:
    • Large clock differences can cause missed data
    • Verify OTM server time is accurate
    • Contact Oracle administrator if server time is wrong
  5. Resync to verify:
    • Run a full resync to get all data
    • Compare with incremental sync results
    • Review sync state and cursor values in logs
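
The server time offset mentioned in step 3 amounts to shifting cursor timestamps by the measured difference between the OTM server clock and the sync host. An illustrative sketch (the function name is hypothetical):

```python
from datetime import datetime

# Shift a locally stored cursor into the OTM server's clock domain
# before querying UPDATE_DATE. Offset is positive if the server
# clock runs ahead of the sync host.
def apply_server_offset(local_cursor: datetime,
                        server_time: datetime,
                        local_time: datetime) -> datetime:
    offset = server_time - local_time
    return local_cursor + offset

local_cursor = datetime(2024, 5, 1, 12, 0, 0)
server_time = datetime(2024, 5, 1, 12, 0, 45)   # server 45s ahead
local_time = datetime(2024, 5, 1, 12, 0, 0)
print(apply_server_offset(local_cursor, server_time, local_time))
```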

High error count during sync

Problem:

  • Many rows fail to process
  • "Failed to process row" errors in logs
  • Sync completes but with warnings

Solutions:

  1. Check data quality:
    • OTM may have malformed data in some rows
    • Review error logs for specific row failures
    • Check if primary key values are present
  2. Verify field types:
    • Type mismatches can cause processing errors
    • Check if field values match expected types
    • Schema inference may have guessed wrong types
  3. Review permissions:
    • Some rows may have restricted access
    • User permissions may filter certain records
    • Verify user can access all required data
  4. Contact support:
    • Provide sync job ID and error logs
    • Include sample row data that fails
    • Email support@supa-flow.io with details

Schema refresh not working

Problem:

  • New objects not appearing
  • Schema seems outdated
  • Recently added fields missing

Solutions:

  1. Force schema refresh:
    • Set Schema Refresh Interval to 0
    • Save and test connection
    • Wait a few minutes for discovery to complete
  2. Check object permissions:
    • User may not have access to new objects
    • Verify permissions were updated in OTM
    • Check security policies haven't changed
  3. Verify object exists:
    • Log into OTM and check object is configured
    • Try exporting data manually via OTM UI
    • Ensure object is not disabled
  4. Clear cached schema:
    • Delete and recreate the source
    • Complete connection test again
    • New schema will be discovered

Support

Need help? Contact us at support@supa-flow.io