Feature request #5546

Add a warning when vector fields/values are above the limits

Added by TJ Maciak almost 12 years ago. Updated almost 7 years ago.

Status: Open
Priority: Normal
Assignee: -
Category: Vectors
Pull Request or Patch supplied: No
Resolution:
Easy fix?: No
Copied to github as #: 15139

Description

We converted a shapefile of neighborhoods from EPSG:4326 to EPSG:3857 that contained a field called GEOID, 13 characters in length (example: 2634000031001). After the conversion to 3857 the values all turned into -2147483648. When discussing this with user lynxlynxlynx on the #qgis channel, he suggested this was some type of LONG_MIN overflow error.

This was a problem in both QGIS 1.5 (Mac) and 1.7.5 (Windows).

To try to resolve this, we first attempted to change the column name, thinking GEOID might be a reserved name. After renaming it to GEOID10, the same result was observed.

Next we tried to change the column type from REAL to STRING using the "Table Manager" plugin, but we were unable to do so. We could not find any other way within QGIS to change the column type.

Our workaround was to go back to our GIS analyst and have him save the shapefile with GEOID as a STRING. After that, the conversion to 3857 preserved GEOID correctly.

My suggestion for improving QGIS is twofold:

1) Give the user an error message if any attribute in the destination file does not match the corresponding value in the source file (see the sketch after this list).
2) Provide a way for the user to change the column types on the fly during reprojection (or prior to it), so we would not have to go back to a GIS analyst to update the shapefile.
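A check along the lines of point 1 could also be scripted outside QGIS. Here is a minimal sketch using the GDAL/OGR Python bindings (the file names are hypothetical; GEOID is the field from this report):

from osgeo import ogr

def mismatched_values(src_path, dst_path, field_name):
    # Keep the dataset objects referenced; OGR layers become invalid
    # when their parent dataset is garbage-collected.
    src_ds = ogr.Open(src_path)
    dst_ds = ogr.Open(dst_path)
    src_layer = src_ds.GetLayer(0)
    dst_layer = dst_ds.GetLayer(0)
    # Compare one attribute column feature by feature.
    for src_feat, dst_feat in zip(src_layer, dst_layer):
        src_val = src_feat.GetField(field_name)
        dst_val = dst_feat.GetField(field_name)
        if src_val != dst_val:
            yield src_feat.GetFID(), src_val, dst_val

for fid, before, after in mismatched_values("neighborhoods_4326.shp",
                                            "neighborhoods_3857.shp",
                                            "GEOID"):
    print("feature %d: GEOID changed from %r to %r" % (fid, before, after))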

I am including a zipped sample shapefile so that you can reproduce this error. If you need further clarification, please feel free to contact me.

Thanks,

TJ Maciak

QGIS-test-shapefile.zip (14.4 KB) TJ Maciak, 2012-05-07 09:26 AM

Test_AttrFieldDefinitions_Point.shp.zip - [shapefile created with QGIS] (1.95 KB) Christine Schmidt, 2012-05-28 02:20 PM

Test_AttrFieldDefinitions_Point_AGcopy.shp.zip - [copy of the shapefile above, performed with ArcGIS; the field created with REAL, length 20, precision 5 was changed to length 19, precision 5] (2.31 KB) Christine Schmidt, 2012-05-28 02:20 PM


Related issues

Related to QGIS Application - Bug report #5173: Shape file field precision includes the . when saving, bu... Closed 2012-03-13
Duplicated by QGIS Application - Bug report #8430: Field Calculator - area failing - overflowing Closed 2013-08-08

History

#1 Updated by Giovanni Manghi almost 12 years ago

To change the column datatype of a shapefile you need to use the QGIS field calculator. The usual approach is to "clone" one column into another one of a different datatype.
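For scripting the same operation, a rough sketch with today's PyQGIS API (QGIS 3.x; the layer path and the new field name are only examples, and the 1.x API current at the time of this report differed):

from qgis.core import QgsVectorLayer, QgsField, edit
from qgis.PyQt.QtCore import QVariant

layer = QgsVectorLayer("neighborhoods.shp", "neighborhoods", "ogr")  # hypothetical path

with edit(layer):
    # add a string column wide enough for the 13-digit GEOID codes
    layer.addAttribute(QgsField("GEOID_TXT", QVariant.String, len=20))
    layer.updateFields()
    idx = layer.fields().indexFromName("GEOID_TXT")
    # copy the values over, converted to text
    for feature in layer.getFeatures():
        layer.changeAttributeValue(feature.id(), idx, str(feature["GEOID"]))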

#2 Updated by TJ Maciak almost 12 years ago

We did try to clone the column using the "Table Manager" plugin (in 1.7.4) but were not given an option to change the type. Is there a better way to clone it that you can tell me how to do? Thank you :)

#3 Updated by Giovanni Manghi almost 12 years ago

  • Category changed from Projection Support to Vectors
  • Subject changed from Reproject shape file from 4326 to 3857 worked, but attribute data was not converted correctly to saving shapefile to another CRS or cloning a column causes column values corruption

New description:
Take the attached shapefile. Clone the column "GEOID" into another one, or simply re-save the shapefile as a new one (with the same CRS or a different one), and the original values are changed to "-2147483648".

Note 1: the DBF specification says that an integer column is of int4 type (values between -2147483648 and +2147483647), so a DBF column holding a value such as 2634000031005 does not respect the specification, and I guess that can cause issues. QGIS should probably handle this situation better, but it does not seem to be at fault.

See for example here: http://www.postgresql.org/docs/8.1/static/datatype.html

If you need such a large number you must use something more advanced, like SpatiaLite or PostGIS.

I would vote to close this as invalid and file a ticket asking for better handling (with warnings) of DBF limits.
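For reference, the proposed warning could boil down to a range check like this rough sketch with the GDAL/OGR Python bindings (the int4 bounds are the ones quoted above; the helper name is made up):

from osgeo import ogr

INT4_MIN, INT4_MAX = -2147483648, 2147483647  # the int4 range quoted above

def values_outside_int4(path):
    # Yield (fid, field name, value) for numeric values that cannot be
    # represented in a 32-bit integer column.
    ds = ogr.Open(path)
    layer = ds.GetLayer(0)
    defn = layer.GetLayerDefn()
    numeric = [defn.GetFieldDefn(i).GetName()
               for i in range(defn.GetFieldCount())
               if defn.GetFieldDefn(i).GetType() in (ogr.OFTInteger, ogr.OFTReal)]
    for feat in layer:
        for name in numeric:
            value = feat.GetField(name)
            if value is not None and not INT4_MIN <= value <= INT4_MAX:
                yield feat.GetFID(), name, value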

#4 Updated by Giovanni Manghi almost 12 years ago

  • Status changed from Open to Feedback
  • Priority changed from Normal to Low

#5 Updated by Giovanni Manghi almost 12 years ago

TJ Maciak wrote:

Is there a better way to clone it that you can tell me how to do?

I already told you, the QGIS field calculator :)

#6 Updated by Giovanni Manghi almost 12 years ago

  • Subject changed from saving shapefile to another CRS or cloning a column causes column values corruption to Add a warning when vector fields/values are above the limits
  • Target version set to Version 2.0.0
  • Priority changed from Low to Normal
  • Status changed from Feedback to Open
  • Tracker changed from Bug report to Feature request

New description:

It is not unusual to find shapefiles with field values that are above the limits of the column datatype.

When doing any operation with such a vector, the result in those columns is "wrong".

I believe it is necessary to add a warning somewhere to let the user know what is going on.
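As a sketch of how such a warning could surface, here is a snippet for the QGIS Python console (QGIS 3 API; the 32-bit bounds are an assumption about the target format):

from qgis.utils import iface

INT4_MIN, INT4_MAX = -2147483648, 2147483647

layer = iface.activeLayer()
suspect = []
for feature in layer.getFeatures():
    for field in layer.fields():
        value = feature[field.name()]
        # only integer attribute values can overflow a 32-bit integer column
        if isinstance(value, int) and not INT4_MIN <= value <= INT4_MAX:
            suspect.append((feature.id(), field.name(), value))

if suspect:
    iface.messageBar().pushWarning(
        "Field limits",
        "%d value(s) exceed the 32-bit integer range and may be corrupted on save"
        % len(suspect))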

#7 Updated by Christine Schmidt almost 12 years ago

We also had a problem recently with a shapefile in QGIS. A field of this shapefile was defined as REAL, length 19, precision 0, and contained an object_id. An attempt to save/export selected features of the shapefile caused all the object_id fields to be filled with -2147483648.

We did some testing with that field definition (REAL, length 19, precision 0): only values in the range -2147483647 to 2147483647 seem to be insertable. Beyond that, -2147483648 appears in the fields after saving the table. The object_id in the original shapefile exceeded that range. This happened in a field defined as REAL, not INTEGER.

In the same shapefile, a field of type REAL defined with length=10 and precision=5 accepted a value of 999999999,9999.

We also observed an oddity with the field definitions: e.g. a field created in QGIS as REAL, length 10, precision 5 is read by ArcGIS and gvSIG as REAL, length 10, precision 6. That is only a slight difference, but the definition should be read the same in all programs, shouldn't it? By the way: fields defined in QGIS as Integer were read as Double (with scale 0) in ArcGIS. Is that supposed to be normal?
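The discrepancy is easy to inspect directly from the DBF header. A small sketch with the GDAL/OGR Python bindings, run against the attached shapefile, prints what OGR itself reports for each field:

from osgeo import ogr

ds = ogr.Open("Test_AttrFieldDefinitions_Point.shp")
layer = ds.GetLayer(0)
defn = layer.GetLayerDefn()
for i in range(defn.GetFieldCount()):
    field = defn.GetFieldDefn(i)
    # name, OGR type, DBF width and precision as stored in the header
    print(field.GetName(), ogr.GetFieldTypeName(field.GetType()),
          field.GetWidth(), field.GetPrecision())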

I have included two sample shapefiles:
Test_AttrFieldDefinitions_Point.shp [shapefile created with QGIS]
Test_AttrFieldDefinitions_Point_copy.shp [copy of the shapefile above, performed with ArcGIS; the field created with REAL, length 20, precision 5 was changed to length 19, precision 5]

I would propose a new title: "Issue with attribute field definition in a shapefile". A warning for users who try to create fields beyond the limits would be really good.

#8 Updated by Alister Hood over 11 years ago

Observed also an absurdity with field definitions: e.g. a field definition created in QGIS with REAL, length 10, precision 5, is read by ArcGIS and gvSIG as REAL, length 10, precision 6.

Also see #5173

#9 Updated by Pirmin Kalberer over 11 years ago

  • Target version changed from Version 2.0.0 to Future Release - Nice to have

#10 Updated by Giovanni Manghi over 10 years ago

see also #8430

#11 Updated by Giovanni Manghi almost 7 years ago

  • Easy fix? set to No
