When performing bulk data ingestion using the Batch Ingestion API, the most critical constraint that must be defined is the Data Schema. Adobe Experience Platform is built on the principle of Experience Data Model (XDM) compliance: every batch created must be associated with a specific Dataset, which in turn is strictly bound to an XDM schema.
When a developer initiates a "Create Batch" request, the platform requires the datasetId. This ID ensures that every record in the payload is validated against the structure, data types, and mandatory fields defined in the schema. This constraint is fundamental to maintaining data integrity within the Data Lake and ensures that the Real-Time Customer Profile service can correctly ingest and merge profile fragments. Options such as batch size and batch frequency are typically environmental or orchestration settings rather than constraints defined within the API's batch-creation payload itself. By enforcing the Data Schema constraint at the ingestion point, Adobe's CDP prevents "dirty data" from entering the system, ensuring that segmentation and activation services can rely on a standardized, predictable data structure across all sources.
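As a sketch, the "Create Batch" request described above might look like the following Python helper. The endpoint and header names follow Adobe's documented pattern for the Batch Ingestion API, but the dataset ID, API key, IMS org, and access token shown here are placeholders, not real credentials.

```python
import json

# Hypothetical dataset ID; in practice this comes from your Platform dataset.
DATASET_ID = "5f3c3cedb2805c194ff0b69a"

def build_create_batch_request(dataset_id, access_token, api_key, ims_org):
    """Return the URL, headers, and JSON body for a Create Batch call."""
    url = "https://platform.adobe.io/data/foundation/import/batches"
    headers = {
        "Authorization": f"Bearer {access_token}",
        "x-api-key": api_key,
        "x-gw-ims-org-id": ims_org,
        "Content-Type": "application/json",
    }
    # datasetId is the required constraint: it binds every record in the
    # batch to the dataset's XDM schema for validation on ingestion.
    body = {
        "datasetId": dataset_id,
        "inputFormat": {"format": "json"},
    }
    return url, headers, json.dumps(body)

url, headers, body = build_create_batch_request(
    DATASET_ID, "<ACCESS_TOKEN>", "<API_KEY>", "<IMS_ORG>"
)
print(url)
```

The helper only constructs the request; sending it (for example with the `requests` library) and then uploading files to the returned batch ID would follow as separate steps.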