Common analysis issues and solutions

Common issues you may encounter when performing analysis in ArcGIS Velocity real-time and big data analytics are described below, along with recommended solutions. If you encounter an issue that's not addressed below, check the Esri Community or contact Esri Technical Support.

Note:

For issues and solutions related to accessing the Velocity application, signing in, or creating and starting items, see Troubleshoot.

Resolve the Could not find a refresh token for user: username error

ArcGIS Velocity sets a refresh token for each user the first time a feed or analytic is started. This refresh token is used to generate access tokens as needed to access online items. In some cases, there could be an issue with the refresh token, such as when a user changes their password.

To resolve refresh token issues reported in the logs, such as Could not find a refresh token for user or any other error related to refresh tokens, follow the steps below.

  1. In a web browser, open the ArcGIS Velocity app.
  2. Sign in with your ArcGIS Online credentials.
  3. From the ArcGIS Velocity Home page, click the information button next to Subscription Utilization.

  4. On the subscription information page, click Reset Refresh Token.

    A window appears to confirm you want to delete your current token and register a new one.

  5. Click Delete to delete the existing refresh token.

    A window appears confirming the existing token was deleted.

  6. Sign in again using the same ArcGIS Online account credentials to retrieve and register a new token.
  7. Click Close once a new token is successfully registered.
  8. Restart your feeds and analytics.

Data configuration

The following sections describe data configuration errors.

No time is defined or the operation requires the features to have time

Many tools—including track-based tools, temporal filters or joins, and tools performing time step analysis—require the input data to have time specified.

Time is specified on data by identifying the Start Time or End Time key fields. For more information on how to specify start and end times, see Configure input data.

If the Start Time or End Time fields are not identified, tools requiring features to have time will report an error such as Dataset is invalid: operation requires the features to have time or Dataset is invalid: no time is defined.
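
If the incoming timestamp values arrive as text rather than as date values, they generally need to be interpreted as dates before they can serve as Start Time or End Time fields. As an illustrative sketch only (the observed_at field name is hypothetical), an Arcade expression such as the following, for example in a Calculate Fields tool, converts an ISO 8601 text value to a date:

  // Convert a hypothetical ISO 8601 text field, for example
  // "2024-05-01T12:30:00Z", to a date value.
  Date($feature.observed_at)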

Inbound features do not have geometry specified

Many analytic tools in both real-time and big data analytics require the input dataset to have geometry specified. Geometry can be configured for features in several ways as part of the feed or data source configuration.

If you receive this error message, you must either configure geometry in your feed or source configuration, or use a Calculate Fields or Map Fields tool to configure an Arcade expression that generates a geometry object.
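
For example, if the incoming data includes latitude and longitude attribute fields, an Arcade expression similar to the following constructs a point geometry. This is a minimal sketch; the longitude and latitude field names and the WGS 1984 (WKID 4326) spatial reference are assumptions you should adjust to match your data:

  // Build a point geometry from hypothetical longitude and latitude fields
  // in the WGS 1984 (WKID 4326) spatial reference.
  Point({
    x: $feature.longitude,
    y: $feature.latitude,
    spatialReference: { wkid: 4326 }
  })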

At least one valid TRACK_ID field needs to be selected or Dataset is invalid: operation requires TRACK_ID tag set errors

Many track-based analytic tools, such as Calculate Journeys and Calculate Motion Statistics, require the input data to have a Track ID key field specified.

A Track ID is a field in an incoming message or dataset that relates observations to specific entities. For example, a truck might be identified by its license plate number or an aircraft by an assigned flight number. These identifiers can be used as Track IDs to track the events associated with a particular real-world entity or set of incidents. A Track ID is specified as part of a feed or data source schema. For more information, see Configure input data.

If a Track ID is not specified, track-based analytic tools will report an error such as At least one valid TRACK_ID field needs to be selected.
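
If no single field uniquely identifies each entity, one option is to derive a combined identifier with an Arcade expression, for example in a Calculate Fields tool, and then designate that field as the Track ID. A minimal sketch, assuming hypothetical airline and flight_number fields:

  // Combine hypothetical airline and flight_number fields into a single
  // identifier, for example "DAL-2189", suitable for use as a Track ID.
  Concatenate([$feature.airline, $feature.flight_number], "-")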

Big data analytics

The following sections describe big data analytics errors.

Recurring big data analytic task took longer to run than the set recurrence interval. The next job will be skipped.

Big data analytics can be scheduled to run periodically, to run periodically within a time frame, or to run at a recurring time.

This message will be written to the logs if an analytic is still running at the time of another scheduled run, for example, if an analytic is configured to run every minute but takes roughly 3 minutes to complete.

If this message appears frequently in the logs of a big data analytic, address it by either allocating additional resources in the run settings so the analytic completes faster or adjusting the recurrence interval so the analytic can finish before the next scheduled run.

The recurring big data analytic failed because insufficient resources were available in the Velocity environment.

When a big data analytic is scheduled to run at a certain time or to recur repeatedly, sufficient resources must be available in the Velocity environment to fulfill each run at the time it runs.

If the Velocity environment does not have sufficient resources for the recurring big data analytic to perform its processing at the time of the run, each run will be skipped until sufficient resources are available. Velocity will keep attempting to run the big data analytic each time it is scheduled to run or recur.

Analytic failed, log message encountered: "Big data analytic task {id} ran out of memory during execution. Adjust the configuration to avoid failures."

When certain tools are present in an analytic and large volumes of data are being processed, additional run settings resources may need to be provided to ensure a successful run. If you encounter this log message, adjust the analytic run settings and change from a default plan to a Large or Extra Large plan.

If you increased your run settings resources and are still encountering this error, contact Esri Technical Support.

Analytic failed, status ToolTip or log message encountered: "The analytic failed with the reason OOMKilled (out of memory killed) for the driver pod. Please increase run setting resources and run again."

When certain tools are present in an analytic and large volumes of data are being processed, additional run settings resources may need to be provided to ensure a successful run. If you encounter this log message, adjust the analytic run settings and change from a default plan to a Large or Extra Large plan.

If you increased your run settings resources and are still encountering this error, contact Esri Technical Support.

Future enhancements will automatically increase the relevant resources to ensure a successful analytic run as part of a retry policy.

Coordinate systems

The following sections describe coordinate system errors.

The input dataset is in a geographic coordinate system, but this operation requires a projected coordinate system.

Several analytic tools only work with data in either a geographic or a projected coordinate system. For example, the Calculate Density tool can only process data that is in a projected coordinate system. The solution is to add the Project tool before the affected tool to project the data to a projected coordinate system, such as WGS 1984 Web Mercator Auxiliary Sphere (well-known ID [WKID] 102100).

Planar and geodesic spatial relationships

When configuring spatial relationships such as near planar or near geodesic in analytic tools, you must choose the spatial relationship appropriate to the coordinate system of the tool's input data.

Specifically, when configuring near planar spatial relationships, the spatial reference of both the target and join datasets must be a projected coordinate system. To address this, either use a near geodesic spatial relationship or use the Project tool to project the target or join datasets to a projected coordinate system.