Add a cloud storage or database data store item

Data store items in ArcGIS Online are used to access data for ArcGIS Data Pipelines.

When you use the Data Pipelines editor to add a dataset from a cloud storage location or database, a data store item is created in your content. You can create a data store item this way, but the Data Pipelines editor consumes credits while you're connected. Alternatively, you can add a data store item from the My content tab of the Content page in ArcGIS Online, which does not use credits.

Add a data store item

The role assigned to your account must have the following privileges, at minimum, to create a cloud storage or database data store item in ArcGIS Online:

  • Create and run data pipelines
  • Create, update, delete (content)
  • Publish hosted feature layers

To add a data store item from My content, complete the following steps:

  1. Verify that you are signed in to your organization.
  2. On the My content tab of the Content page, click New item, and click Data store.
  3. Choose the type of data store to add.
    • Database—When using this option, choose the cloud data warehouse to access: Google BigQuery or Snowflake.
    • Cloud Storage—When using this option, choose the cloud blob storage location that contains the data: Amazon S3 or Microsoft Azure Storage.
  4. Click Next.
  5. Provide the connection information. The information required depends on the data location you chose in step 3; continue with the step below that matches your choice.
  6. To connect to Google BigQuery, use service authentication and provide credentials and the data location by doing the following:
    1. Click Select key file, and browse to and choose the key file (.json) that contains the credentials required to access the data warehouse.
    2. In the Project id field, provide the identifier of the BigQuery project that contains the data you need.

    For information about how content is used and limitations, consult the ArcGIS Data Pipelines help.
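Before uploading the key file, it can help to confirm that it is a complete service account key. The sketch below is a hypothetical helper (not part of any ArcGIS API) that parses the .json key file with Python's standard library and returns the project identifier you would enter in the Project id field; the field names checked are the standard ones Google includes in service account keys.

```python
import json

def read_bigquery_key_file(path):
    """Parse a service account key file (.json) and return the
    project identifier it belongs to. Raises ValueError if the
    file is missing fields a standard key is expected to have."""
    with open(path) as f:
        key = json.load(f)
    # Standard service account keys include these fields; the data
    # store needs the whole file, and project_id is also the value
    # entered in the Project id field.
    required = ("type", "project_id", "private_key", "client_email")
    missing = [field for field in required if field not in key]
    if missing:
        raise ValueError(f"Key file is missing fields: {missing}")
    return key["project_id"]
```

If the helper raises an error, regenerate the key from the Google Cloud console rather than editing the file by hand.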

  7. To connect to Snowflake, use user authentication, provide credentials, and identify the location of the data by doing the following:
    1. In the Server field, provide the URL to access your Snowflake account.
    2. Type your Snowflake account username and password in the respective fields.
    3. In the Database field, provide the name of the Snowflake database that contains the data.
    4. In the Schema field, provide the name of the schema that contains the data you need.
    5. In the Warehouse field, provide the name of the warehouse that will provide compute resources when connecting.
    6. In the Role field, provide the name of the role that will confer privileges for the connection.

      The role must have at least USAGE and CREATE STAGE privileges on the schema you specified in substep 4.

    For information about how content is used and limitations, consult the ArcGIS Data Pipelines help.
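The Snowflake form asks for several values, and a blank field is a common cause of failed connections. The hypothetical helper below (the function name and structure are illustrative, not an ArcGIS API) gathers the values into one place and rejects blanks before you fill in the form.

```python
def snowflake_connection_params(server, username, password,
                                database, schema, warehouse, role):
    """Collect the values the Snowflake data store form asks for
    and verify that none of them are blank."""
    params = {
        "server": server,        # e.g. <account>.snowflakecomputing.com
        "username": username,
        "password": password,
        "database": database,
        "schema": schema,
        "warehouse": warehouse,  # supplies compute for the connection
        "role": role,            # needs USAGE and CREATE STAGE on the schema
    }
    blank = [k for k, v in params.items() if not v or not v.strip()]
    if blank:
        raise ValueError(f"Missing values: {blank}")
    return params
```

If the role lacks the required privileges, a Snowflake administrator can grant them with standard SQL, for example `GRANT USAGE, CREATE STAGE ON SCHEMA mydb.myschema TO ROLE myrole;` (names here are placeholders).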

  8. To access content in an Amazon Simple Storage Service (S3) bucket, provide an access key and specify the data location by doing the following:
    1. In the Access key field, provide the access key ID for your Amazon Web Services (AWS) account.
    2. In the Secret key field, provide the secret access key for the access key ID you provided in the previous step.
    3. From the Region drop-down menu, choose the AWS region where the S3 bucket resides.
    4. In the Bucket name field, provide the name of the S3 bucket.
    5. Optionally, in the Folder field, provide the path and subfolder name to directly access a specific subfolder in the bucket.

      If you don't provide a folder name, the data store accesses the root bucket location.

    For information about how content is used and limitations, consult the ArcGIS Data Pipelines help.
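The bucket name and optional folder combine into a single storage location. The sketch below is a hypothetical helper (not an ArcGIS API) showing how the bucket and folder values map to the standard s3:// location that is read, including the root-bucket default when no folder is given.

```python
def s3_location(bucket, folder=None):
    """Build the s3:// location implied by the Bucket name and
    Folder fields. With no folder, the root of the bucket is used."""
    uri = f"s3://{bucket}"
    if folder:
        # Normalize stray slashes so "/data/parquet/" and
        # "data/parquet" produce the same location.
        uri += "/" + folder.strip("/")
    return uri
```

For example, bucket `my-bucket` with folder `data/parquet` resolves to `s3://my-bucket/data/parquet` (names here are placeholders).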

  9. To access content in a Microsoft Azure container, provide the data location and either shared key or shared access signature authentication credentials by doing the following:
    1. Choose one of the following from the Authentication type drop-down menu and provide appropriate authentication information:

      • Shared Key—Provide your Microsoft Azure account name and account key.
      • Shared Access Signature—Provide your Microsoft Azure account name and a shared access signature token.

    2. Choose one of the following from the Storage domain drop-down menu:

      • Azure Cloud—Choose this option if the container is in the commercial Microsoft Azure cloud.
      • Azure US Government—Choose this option if the container is in the Microsoft Azure United States government cloud.
      • Other—Choose this option if you deployed the container in a custom domain. Provide the Azure Blob Storage endpoint in the Endpoint field.

    3. In the Container name field, provide the name of the Azure Blob Storage container.
    4. Optionally, in the Folder field, provide the path and subfolder name to directly access a specific subfolder in the container.

      If you don't provide a folder name, the data store accesses the root container location.

    For information about how content is used and limitations, consult the ArcGIS Data Pipelines help.
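The Storage domain choice determines which Blob Storage endpoint the connection uses. The hypothetical helper below (illustrative only, not an ArcGIS API) shows the mapping, using the standard public endpoint suffixes for the commercial and US Government Azure clouds.

```python
def azure_blob_endpoint(account, domain="azure", custom_endpoint=None):
    """Return the Blob Storage endpoint implied by the Storage
    domain choice: 'azure' (commercial), 'usgov', or 'other'."""
    if domain == "azure":    # Azure Cloud (commercial)
        return f"https://{account}.blob.core.windows.net"
    if domain == "usgov":    # Azure US Government
        return f"https://{account}.blob.core.usgovcloudapi.net"
    if domain == "other":    # custom domain: the endpoint must be supplied
        if not custom_endpoint:
            raise ValueError("A custom endpoint is required for 'other'")
        return custom_endpoint
    raise ValueError(f"Unknown storage domain: {domain}")
```

For an account named `mystorage` in the commercial cloud, this yields `https://mystorage.blob.core.windows.net` (the account name is a placeholder).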

  10. When you finish providing connection information, click Next.
  11. Provide a title.
  12. Choose a folder in My content where you want to save the item.

    Alternatively, you can choose Create new folder from the menu and type a folder name to save the item in a new folder.

    Folder names cannot contain 4-byte Unicode characters.
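A 4-byte Unicode character (in UTF-8) is any code point above U+FFFF, which includes most emoji. The sketch below is a hypothetical check (not an ArcGIS API) you could use to validate a folder name before creating it, based on that code-point rule.

```python
def valid_folder_name(name):
    """Folder names cannot contain 4-byte Unicode characters,
    i.e. code points above U+FFFF (4 bytes when encoded as UTF-8)."""
    return all(ord(ch) <= 0xFFFF for ch in name)
```

Accented characters such as "é" encode in fewer than 4 bytes and remain valid; an emoji in the name does not.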

  13. If the organization administrator has configured content categories, click Assign Category and select up to 20 categories to help people find the item.

    You can also use the Filter categories box to narrow the list of categories.

  14. Optionally, type tags that describe the item.

    Separate the terms with commas (for example, Federal land is considered one tag, and Federal, land is considered two tags).

    As you type, you can select any of the suggested tags that appear; suggestions are generated from tags you previously added.
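The comma rule above can be sketched as a small parsing function (hypothetical, not an ArcGIS API): commas delimit tags, while spaces inside a tag are kept.

```python
def parse_tags(raw):
    """Split a comma-separated tag string: commas delimit tags,
    spaces within a tag are preserved, blanks are dropped."""
    return [t.strip() for t in raw.split(",") if t.strip()]
```

So "Federal land" parses as one tag, and "Federal, land" parses as two.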

  15. Optionally, provide a summary that describes the item.
  16. Click Create connection.

The data store item is created and the item page opens.

