
Batch Segment Service Best Practices 

The Batch Segment Service allows you to send bulk feeds of external segment data to AppNexus. Using Batch Segment, you can easily integrate your first-, second-, or third-party audience data to create dynamic, configurable segments for sophisticated user targeting and increased ROI.

Best Practices

  1. Read the FAQs below. Chances are your question is answered there.
  2. Keep your cookie mapping fresh and match rate high by collecting UIDs as frequently as possible. AppNexus expires UIDs on a rolling basis such that the user IDs who haven't been seen for the longest period of time are removed from our database to make room for new user IDs. If you upload UIDs that are very old, chances are you will have a high percentage of invalid user IDs in your status report.
  3. Batch your files as much as possible. Your data will be uploaded more quickly in a few large files than in a large number of very small files.
  4. Dedupe your files. You'll be able to keep the size and processing time down by removing any duplicate user/segment combinations from your files.
  5. Compress your files. Gzip is the only compression method supported by this service. By compressing your files, you'll be able to upload more data more quickly.
  6. Avoid sending the full segment membership in every file. Instead, you only need to send the changes in segment membership since the last upload. This will greatly reduce the size of your files and speed the upload process.
  7. Use the upload URL immediately. After requesting an upload URL, be sure to use it within 5 minutes or you'll need to request a new URL. You cannot reuse old URLs.
  8. Avoid uploading your largest files during peak hours (10am-10pm EST in the US, 10am-10pm CET in Europe). During peak hours, there are more files in the queue, so your file may take longer to be processed. If you can, schedule your uploads to run overnight. Per the SLA, uploads can take up to 24 hours.
  9. Check the status report. Checking the status report will allow you to catch and correct errors and upload more accurate data.
  10. Confirm it worked. After your data has been processed, you can spot check a few user IDs using the cookie viewer (customer login required to view this link). If you don't see the segments in one geography, check the other - chances are your user is in the other geo.
  11. Include the job ID in support requests. If you're experiencing technical issues, don't forget to include all relevant job IDs when submitting your request to the Customer Support Portal.

Be sure to wait approximately 20 minutes before trying to add users to any newly created segments (to allow these segments to be propagated to all servers in our cloud). In addition, as a best practice, try to minimize the creation of new segments: re-use existing segments where possible, or use segment "values" to further subdivide users within existing segments. These practices will ensure successful user uploads to segments. For details on creating segment "values", see Segment Pixels Advanced and Segment Targeting.
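
Several of the practices above (deduping, compressing, and sorting by user ID) can be combined into one preprocessing step before you request an upload URL. The sketch below is illustrative only: it assumes a simple "UID&lt;TAB&gt;SEGMENT_ID" line layout, which you should adjust to match the actual batch segment file format you use.

```python
import gzip

def prepare_upload(lines, out_path):
    """Dedupe, sort, and gzip-compress a batch segment upload file.

    Assumes each input line is a "UID<TAB>SEGMENT_ID" pair (illustrative
    layout only -- match your real file format). Returns the number of
    unique user/segment pairs written.
    """
    # Dedupe user/segment pairs and sort by user ID for faster processing.
    unique_pairs = sorted({line.strip() for line in lines if line.strip()})
    # Gzip is the only compression format the service accepts.
    with gzip.open(out_path, "wt", encoding="utf-8") as f:
        for pair in unique_pairs:
            f.write(pair + "\n")
    return len(unique_pairs)
```

Running your raw export through a step like this before every upload keeps file sizes down and avoids re-sending duplicate pairs.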


 How do I know if my uploads are successful?

There are a few ways to confirm that your uploads are successful:

1) When checking the status of your file, you will receive a status report that contains useful parameters:

    • num_valid: The number of valid user/segment pairs in the file.
    • num_valid_user: The total number of valid user IDs in the file.
    • num_invalid_user: The total number of invalid user IDs in the file.
    • segment_log_lines: The number of unique valid users added to each segment, in the format [SEGMENT ID]-[NUMBER OF USERS]. Note that this field is limited to 999 lines; if the job contains more than 999 segments, some will not be shown.
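
If you automate status checks, the segment_log_lines field is easy to turn into a lookup table. This helper assumes each line follows the [SEGMENT ID]-[NUMBER OF USERS] format described above:

```python
def parse_segment_log_lines(log_lines):
    """Parse status-report lines of the form "SEGMENT_ID-COUNT".

    Returns a dict mapping segment ID (int) to the number of unique
    valid users added to that segment (int).
    """
    counts = {}
    for line in log_lines:
        # Split on the first "-" only, per the documented format.
        seg_id, _, num = line.strip().partition("-")
        counts[int(seg_id)] = int(num)
    return counts
```

Comparing these counts against the number of pairs you uploaded per segment is a quick way to spot silent drops.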

2) Use the Segment Inventory Overlap report to gauge how many impressions you can expect to see by segment, seller, and country. Note that you'll need to wait a few days after creating new segments before data will show up in this report. This report is currently only available to clients with console accounts (i.e., it is not available to data providers).

3) Spot check a few user IDs to confirm they have been added to the appropriate segments using the cookie viewer. If you don't see the user in one geo, be sure to check the other geo.


4) Use the Segment Loads API report (or Segment Load Reporting in Console) to see the number of valid user IDs that have been loaded into a segment over a given time period.

 How long after my upload until the data is available for targeting?

 Your segment data begins to be available for targeting as soon as your job hits the 'validating' phase. The data is fully ingested and ready for targeting once your job has hit 'completed.' The time until your job hits 'completed' depends on the size of your file and the number of other files in the queue.

Allow up to 24 hours for uploads to process (per the AppNexus SLA).

  Will there be a "fired" timestamp on these users for segment age targeting?

If you choose to include the TIMESTAMP field in your files, you'll specify the timestamp for every user/segment combination. If you do not include TIMESTAMP, the timestamp will be the time at which the data was written to our server-side cookie store, which is usually around the "completed time" shown in your status report. Note that this time is in the UTC time zone.
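
Since the service works in UTC, it's worth normalizing any timestamps you supply before writing them into your files. The sketch below assumes the TIMESTAMP field takes Unix epoch seconds in UTC — confirm the exact representation against the batch segment file-format documentation:

```python
from datetime import datetime, timezone

def utc_epoch(dt):
    """Convert a datetime to Unix epoch seconds in UTC.

    Naive datetimes are assumed to already represent UTC. Whether the
    TIMESTAMP field takes epoch seconds is an assumption here; verify
    against the file-format reference before relying on it.
    """
    if dt.tzinfo is None:
        dt = dt.replace(tzinfo=timezone.utc)
    return int(dt.timestamp())
```

Normalizing once, at file-generation time, avoids subtle segment-age-targeting drift caused by local time zones.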

 What if I have some users in the US and some in Europe?

 AppNexus maintains three separate cookie stores in the US, Europe, and Asia; however, you do not need to upload separate files for each geo. The Batch Segment system will automatically determine which geo the user is in and send the data to that cookie store database. If the user is in multiple geos, the data will be sent to each geo the user exists in.

  Why am I limited to only 999 error lines?

The 999 error lines provided are not meant to be a comprehensive list of all errors. Instead, use them as a diagnostic sample to identify problems with your file and correct them.

  Why are some jobs missing when I view my entire job history?

When you view your complete file upload history, every file you've ever uploaded is included. However, our API limits responses to 100 objects via pagination; you can view additional objects by appending pagination parameters to the API call.

You can read more about pagination on our wiki.
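
For scripted history checks, you can page through the results by offset. This sketch assumes the standard AppNexus pagination query parameters start_element and num_elements — verify them against the API semantics documentation:

```python
def paged_urls(base_url, total, page_size=100):
    """Build the API calls needed to page through `total` objects.

    Assumes start_element/num_elements pagination parameters
    (an assumption -- check the API semantics docs).
    """
    sep = "&" if "?" in base_url else "?"
    return [
        f"{base_url}{sep}start_element={start}&num_elements={page_size}"
        for start in range(0, total, page_size)
    ]
```

With 250 jobs in your history, for example, this yields three calls at offsets 0, 100, and 200.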

 A certain percentage of my users are invalid. What's up with that?

There are several reasons a user ID can be invalid:

  1. If there is a typo in the UID, it's invalid.
  2. AppNexus expires UIDs on a rolling basis such that the users who haven't been seen for the longest period of time are removed from our database to make room for new users. If you upload UIDs that are very old, chances are you will have a high percentage of invalid users in your file. It's a best practice to keep your cookie mapping fresh and match rate high by collecting UIDs as frequently as possible.
  3. If a client syncs a user AppNexus has never seen before, AppNexus (via getUID) performs a bounce to the page, sets a UID, and returns that UID back to the client. However, AppNexus does not store the user ID server side until we see the user a second time. Thus, for all the users we've seen only once from a getUID call, when their UIDs are pushed to us server side via a Batch Segment upload, you will see invalid user ID errors. These IDs exist in the user's cookie, but not in our server-side data store.

 What can I do to decrease the number of invalid users in my files?

Drop the user sync pixel more frequently, and expire old users out of your database after a period of inactivity.

(These recommendations only hold true for display. For mobile, since user syncing is not a possibility, device IDs will be valid if we have previously seen them coming in through mobile supply requests.)
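
Expiring inactive users before each upload is straightforward if you track a last-seen time per UID. The sketch below uses an illustrative 30-day window, which is not an AppNexus-documented value — tune it to your own sync cadence:

```python
from datetime import datetime, timedelta

def active_uids(last_seen, max_age_days=30, now=None):
    """Keep only user IDs seen within the last max_age_days.

    last_seen maps UID -> datetime of the most recent user sync.
    The 30-day default is an illustrative choice, not a documented
    AppNexus expiration window.
    """
    now = now or datetime.utcnow()
    cutoff = now - timedelta(days=max_age_days)
    return {uid for uid, seen in last_seen.items() if seen >= cutoff}
```

Filtering uploads through a check like this keeps stale UIDs out of your files and your invalid-user rate down.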

 If I include the VALUE field, do I have to list a value for every single user/segment combination?

Yes, you do. However, if you do not wish to assign a value, you can enter 0 as the value.
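
When generating files with the VALUE field enabled, it helps to default to 0 centrally so no combination is ever emitted without a value. The colon-delimited SEGMENT:VALUE layout below is illustrative only — match the separators of your actual file format:

```python
def format_pair(uid, segment_id, value=0):
    """Format one user/segment line when the VALUE field is in use.

    Every combination must carry a value; 0 means "no value".
    The tab/colon separators here are illustrative, not the
    documented file format.
    """
    return f"{uid}\t{segment_id}:{value}"
```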

 How often can I push batch segment uploads?

Our systems can handle a maximum of one upload per minute (per member). Any upload interval of one minute or longer will work properly.
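
If your pipeline produces files faster than that, a small throttle on the client side keeps you under the limit. A minimal sketch, working in epoch seconds:

```python
def seconds_until_allowed(last_upload_ts, now_ts, min_interval=60):
    """Return how many seconds to wait before the next upload may start.

    Enforces the one-upload-per-minute-per-member limit client side;
    returns 0 when an upload is allowed immediately.
    """
    return max(0, min_interval - (now_ts - last_upload_ts))
```

Your uploader can sleep for the returned number of seconds (or queue the file) before requesting the next upload URL.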

 Should I compress upload files before submitting them through the API Batch Segment Service?

Yes! As a best practice you should always compress upload files before pushing them through the Batch Segment Service. The gzip compression standard is supported.

  Should I always push my complete audiences every time I upload to segments through the API Batch Segment Service?

As a best practice, you should only upload full audiences through the API Batch Segment Service when you are uploading the audience for the first time or when the audience has expired from its respective segment. In all other cases, upload only the modifications (deltas) to your existing audiences. Note that Batch Segment uploads do not refresh the time to live of cookies in the AppNexus cookie store; they only refresh the time that the cookies are associated with a particular set of segments.
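
Computing those deltas is a simple set difference if you keep a snapshot of the previous upload. A minimal sketch, representing an audience as a set of (uid, segment_id) pairs:

```python
def audience_delta(previous, current):
    """Compute segment membership changes between two uploads.

    previous/current are sets of (uid, segment_id) pairs. Returns
    (to_add, to_remove) so that only the changes -- not the full
    audience -- need to go into the next upload file.
    """
    return current - previous, previous - current
```

Uploading only to_add (and removals for to_remove, per the file format's removal syntax) keeps files small and processing fast.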

  Should I sort the UIDs in my batch segment upload files?

Sorting your file by user ID helps our system process your file more quickly.

 I don't think this is working. Can you help?

Sure! Submit a case through the Customer Support Portal. Note that while the job status reports are kept indefinitely, we only save the actual file of data for 3 days. Please be sure to submit your case as soon as possible after uploading so that we can look at the data you uploaded. 
