Direct Data Intake API Overview

Learn more about the Direct Data Intake API.

The Direct Data Intake (DDI) API treats Visier objects as direct data targets. As direct data targets, objects have:

  • Data load schema: The structure through which the object receives data. The data load schema is also known as the staging schema.
  • Analytic schema: The information about the object that you can query. You can retrieve an object's analytic schema using the Data Model API. For more information, see Data Model API.

Tip: In the API, you can push CSV files with the correct data load schema to a target object to load that object with data. This API doesn't require knowledge of the object's underlying technologies, such as sources and mapping. However, you must apply any required transformations to your data before using the API.

This API may be right for your organization if you want to integrate Visier into a pre-existing ETL (extract, transform, load) process where your data is at a suitable level of granularity and quality. If your data isn't suitable, for example, if it isn't organized into files whose column headings exactly match Visier's data load schema, you must prepare your data before using the API.

Your data file must meet the following requirements:

  • The data file columns match the data target's load schema.
  • Mandatory columns contain values.
  • The source data format is a transactional profile.

    Note: Transactional profile data excludes unchanged records and includes each changed record in its entirety; a row appears only if one or more of its attributes changed. The time associated with the change is included in the record.
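For example, a transactional profile file for an Employee_Exit target might look like the following. The column names here are illustrative only; retrieve the object's actual data load schema before building your files.

```csv
EmployeeID,Event_Date,Exit_Reason
E1001,2023-06-30,Resignation
E1042,2023-07-15,Retirement
```

Each row is a complete changed record, and Event_Date carries the time associated with the change.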

The DDI API processes files in the following steps:

  • Update the data intake configuration (Optional): If your Visier tenant contains existing data for any object before you use the DDI API, you must change the data intake data category from the default (Primary) to Supplemental or Extension, depending on your scenario.
    • Primary: The Visier tenant doesn't contain customer data yet, that is, there is no existing data category. After you commit a transaction for the first time, the DDI data category becomes the primary data category. Primary is the default configuration.
    • Supplemental: The Visier tenant contains customer data that was loaded through a method other than the DDI API. With Supplemental mode, you can use the DDI API to load data for objects that don't yet contain data.
    • Extension: The Visier tenant contains customer data that was loaded through a method other than the DDI API and you want to load additional data for already-loaded objects. To use Extension mode, your data load actions must meet the following criteria:
      • The object is primarily loaded through other data transfer methods (not DDI API).
      • The data transfer methods, DDI API and otherwise, must not load the same properties on the object. For example, you cannot load EmployeeID with SFTP and then try to load EmployeeID with the DDI API.
  • Retrieve an object's data load schema: Discover the data load schema associated with the data target that you want to load. For example, you can call the API to find the data load schema for Employee_Exit.
  • Start a data intake transaction: Create a transaction to contain your data files. You can upload data files for multiple data targets in a single transaction. For example, one transaction can hold data files for Employee_Exit, Employee, and Applicant if you want to upload data for those objects.
  • Upload files: Send data files to a previously-created transaction. The files are not processed in Visier until you commit the transaction. You can upload multiple files for the same target object within a transaction, but you must send one file per API call.
  • Commit a transaction: Process a transaction and its uploaded data files. This starts a processing job to load the data files into Visier. After committing a transaction, you cannot upload additional files to the transaction.
  • Check transaction status (Optional): Retrieve the status of a committed transaction's processing job.
  • Roll back a transaction (Optional): If you uploaded incorrect data files or encountered other issues, you can roll back the transaction to cancel it. A rolled-back transaction cannot be used again; to continue, you must start a new transaction.

    Note: Committed transactions cannot be rolled back.
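The transaction lifecycle above can be sketched as a sequence of HTTP calls. A minimal sketch follows; the endpoint paths are assumptions for illustration, not authoritative routes, so confirm the exact paths, host, and authentication headers against the Visier API reference for your tenant.

```python
# Sketch of the DDI transaction lifecycle as (HTTP method, path) pairs.
# All paths below are illustrative assumptions; check your tenant's API
# reference for the authoritative routes and auth headers.

def ddi_route(step: str, transaction_id: str = "", object_name: str = "") -> tuple[str, str]:
    """Return the assumed (method, path) for each step of the DDI flow."""
    routes = {
        # Discover the data load schema for a target object, e.g. Employee_Exit.
        "get_schema": ("GET", f"/v1/data/directloads/prod/schemas/{object_name}"),
        # Create an empty transaction to hold data files.
        "start": ("POST", "/v1/data/directloads/prod/transactions"),
        # Upload one file per call; repeat for each file and each target object.
        "upload": ("PUT", f"/v1/data/directloads/prod/transactions/{transaction_id}/{object_name}"),
        # Commit starts the processing job; no further uploads after this.
        "commit": ("POST", f"/v1/data/directloads/prod/transactions/{transaction_id}"),
        # Poll the committed transaction's processing job status.
        "status": ("GET", f"/v1/data/directloads/prod/transactions/{transaction_id}"),
        # Roll back an uncommitted transaction; it cannot be reused afterward.
        "rollback": ("DELETE", f"/v1/data/directloads/prod/transactions/{transaction_id}"),
    }
    return routes[step]

# Walk through a typical load: schema lookup, start, upload, commit, status.
for method, path in (
    ddi_route("get_schema", object_name="Employee_Exit"),
    ddi_route("start"),
    ddi_route("upload", "txn-123", "Employee_Exit"),
    ddi_route("commit", "txn-123"),
    ddi_route("status", "txn-123"),
):
    print(method, path)
```

Note that "commit" and "upload" act on the same transaction ID; only the rollback step (mapped here to DELETE) is valid before the commit, matching the rule that committed transactions cannot be rolled back.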