The Update fields tool updates a field name or field type.
Examples
The Update fields tool can be used in scenarios such as the following:
- A dataset has fields with uninformative names. Update the fields to make the field names more meaningful.
- A ZIP Code field was detected as type integer, but your workflow requires the ZIP Code values to be strings. Update the field type from integer to string.
Parameters
The following table outlines the parameters used in the Update fields tool:
Parameter | Description |
---|---|
Input dataset | The dataset containing the fields that will be updated. |
Updates | A list of one or more fields with the updates that will be made to each field. |
Field to update | The field that will be updated. |
New field name | The new name of the updated field. |
New field type | The new type of the updated field. |
Decimal separator | Specify a decimal separator value if you are casting string values to a number and the string values do not use a period (.) as the decimal separator. This parameter is optional. |
Usage notes
Use the Input dataset parameter to identify the dataset containing the fields that will be updated.
Provide the field to update using the Field to update parameter. Click the Add button to update more than one field.
You can update the field name, the field type, or both. At least one update is required for each field. The options are as follows:
- New field name—Updates the name of the specified field.
- New field type—Updates the type of the specified field.
The following are the field type options:
- Boolean—Outputs a Boolean type field. Boolean fields support values of true and false.
- Double—Outputs a double type field. Double fields support fractional numbers.
- Integer—Outputs an integer type field. Integer fields support whole numbers.
- String—Outputs a string type field. String fields support strings of characters (text).
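The rename and retype operations described above can be sketched in plain Python. This is an illustration only, not the Data Pipelines API; the `update_field` helper and its record-list input are hypothetical stand-ins for the tool's behavior.

```python
# Hypothetical sketch of the Update fields operation: rename a field and/or
# cast its values to a new type across a list of records. This is not the
# tool's implementation, only an illustration of the equivalent transformation.

def update_field(records, field, new_name=None, new_type=None):
    """Rename a field and/or cast its values; at least one update is expected."""
    updated = []
    for record in records:
        row = dict(record)
        value = row.pop(field)
        if new_type is not None:
            value = new_type(value)  # cast to the new field type
        row[new_name or field] = value  # apply the new field name, if any
        updated.append(row)
    return updated

# A ZIP Code field detected as integer, cast to string with a clearer name:
rows = [{"zip": 2116}, {"zip": 73301}]
result = update_field(rows, "zip", new_name="zip_code", new_type=str)
# result: [{"zip_code": "2116"}, {"zip_code": "73301"}]
```

This mirrors the scenario in the Examples section, where an integer ZIP Code field is converted to a string so that leading zeros and text-based workflows are preserved.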
Use the Decimal separator parameter to cast string fields to a numeric type when the strings use a decimal separator other than a period (.). This parameter is optional. The following examples show how to use the Decimal separator parameter:
- A string field contains double values that use a comma as the decimal separator, such as "3,55". To convert this field to a double, specify a new field type of double and a comma (,) as the decimal separator. The result is a double field with values such as 3.55.
- A string field contains integer values with trailing zeros after the decimal separator, such as "5,0000". To convert this field to an integer, specify a new field type of integer and a comma (,) as the decimal separator. The result is an integer field with values such as 5.
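The two conversions above can be sketched as follows. This is not the tool's internal logic, only a minimal Python illustration of the decimal-separator cast; the `cast_with_separator` helper is hypothetical.

```python
# Sketch of the decimal-separator cast described above: replace the custom
# separator with a period, then parse the number. Not the tool's code.

def cast_with_separator(value, target, decimal_separator=","):
    normalized = value.replace(decimal_separator, ".")
    number = float(normalized)
    return int(number) if target is int else number

print(cast_with_separator("3,55", float))   # 3.55
print(cast_with_separator("5,0000", int))   # 5
```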
For all numeric fields, the preview formats numbers according to the locale set in your user settings.
Casting behavior
The Update fields tool handles casting differently depending on the input type and the target type. A supported cast either succeeds or returns a null value when the original value cannot be converted to the target type. An unsupported cast causes the data pipeline to fail with an error. The following table details whether each cast is supported and, where applicable, the format of the cast value:
Input type | String | Integer | Double | Boolean |
---|---|---|---|---|
String | N/A | Yes (Round) | Yes | Yes |
Small integer | Yes | Yes | Yes | Yes |
Integer | Yes | N/A | Yes | Yes |
Big integer | Yes | Yes | Yes | Yes |
Float | Yes | Yes (Round) | Yes | Yes |
Double | Yes | Yes (Round) | N/A | Yes |
Date only | Yes (ISO 8601) | No | No | No |
Date | Yes (ISO 8601) | Yes (Seconds) | Yes (Seconds) | No |
Boolean | Yes ("true", "false") | Yes | Yes | N/A |
Blob | Yes | No | No | No |
Array | Yes (JSON) | No | No | No |
Map | Yes (JSON) | No | No | No |
Struct | Yes (EsriJSON) | No | No | No |
Point | Yes (EsriJSON) | No | No | No |
Multipoint | Yes (EsriJSON) | No | No | No |
Polyline | Yes (EsriJSON) | No | No | No |
Polygon | Yes (EsriJSON) | No | No | No |
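The succeed-or-null rule above can be sketched in Python. This illustrates the documented behavior for supported casts only; the `safe_cast` helper is hypothetical and is not the tool's implementation.

```python
# Sketch of the "succeed or return null" casting rule: a supported cast that
# fails for a particular value yields None (null) rather than raising an error.

def safe_cast(value, target):
    try:
        if target is int:
            # String-to-integer casts round, per the table above.
            return round(float(value))
        return target(value)
    except (TypeError, ValueError):
        return None  # value could not be cast; the output is null

print(safe_cast("42.7", int))    # 43
print(safe_cast("abc", float))   # None
```

An unsupported cast (for example, Date only to Integer) has no row marked "Yes" in the table, and in the tool it fails the pipeline rather than returning null.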
Outputs
The tool output contains the input dataset with the newly updated fields.
Licensing requirements
The following licensing and configurations are required:
- Creator or Professional user type
- Publisher, Facilitator, or Administrator role, or an equivalent custom role
To learn more about Data Pipelines requirements, see Requirements.