Bug report #845
Problem with large attribute tables
| Affected QGIS version: | | Regression?: | No |
| Operating System: | Debian | Easy fix?: | No |
| Pull Request or Patch supplied: | | Resolution: | fixed |
| Crashes QGIS or corrupts data: | | Copied to github as #: | 10904 |
I tried to open the attribute table of a very large shapefile.
dbf -> 770 MB
shp -> 384 MB
shx -> 7 MB
Loading and visualizing the file works fine, although it takes a few minutes, but opening the attribute table fails.
The error message is:
QIconvCodec::convertToUnicode: using ASCII for conversion, iconv_open failed
terminate called after throwing an instance of 'std::bad_alloc'
Aborted (core dumped)
When I generate a stacktrace, all it says is:
#0 0xffffe410 in ?? ()
Cannot access memory at address 0xbf8e8520
- Comment by Marco Hugentobler:
The problem is that you run out of virtual memory (RAM and swap partition are both full), so the bad_alloc exception is thrown. In the short run we should add a try/catch block; in the long run we should not read all the rows into memory.
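As a minimal sketch of that short-run fix: wrap the bulk load in a try/catch so an out-of-memory condition becomes a reported error rather than a crash. AttributeRow, readNextRow(), and loadAllRows() are illustrative names, not actual QGIS API:

```cpp
#include <cstddef>
#include <iostream>
#include <new>      // std::bad_alloc
#include <vector>

// Illustrative stand-ins for the real feature/row types, not QGIS API.
struct AttributeRow { /* field values for one feature */ };
AttributeRow readNextRow() { return AttributeRow{}; }

// Returns false instead of letting std::bad_alloc terminate the process.
bool loadAllRows(std::vector<AttributeRow> &rows, std::size_t rowCount) {
    try {
        rows.reserve(rowCount);              // can already throw for huge tables
        for (std::size_t i = 0; i < rowCount; ++i)
            rows.push_back(readNextRow());   // each appended row may also throw
        return true;
    } catch (const std::bad_alloc &) {
        std::vector<AttributeRow>().swap(rows);  // release the partial buffer
        std::cerr << "Not enough memory to load the attribute table\n";
        return false;  // caller can show an error dialog instead of crashing
    }
}
```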
#2 Updated by Otto Dassau about 14 years ago
- Comment by Tim Sutton:
This is really a side effect of ticket
The table display needs to be redesigned to use the Qt model/view ("Interview") framework. I have some in-progress work to do this, and Martin has been doing some similar work in Python.
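For illustration, a minimal sketch of that model/view approach using Qt's QAbstractItemModel fetch API: the view requests rows in batches via canFetchMore()/fetchMore() instead of the table loading every record up front. LazyAttributeModel and its fetchRow() helper are hypothetical, not code from QGIS:

```cpp
#include <QAbstractTableModel>
#include <QVariant>
#include <QtGlobal>

// Hypothetical lazy model: only rows the view has asked for are "in" the model.
class LazyAttributeModel : public QAbstractTableModel {
public:
    LazyAttributeModel(int totalRows, int columns, QObject *parent = nullptr)
        : QAbstractTableModel(parent), m_total(totalRows), m_columns(columns) {}

    int rowCount(const QModelIndex & = QModelIndex()) const override { return m_loaded; }
    int columnCount(const QModelIndex & = QModelIndex()) const override { return m_columns; }

    QVariant data(const QModelIndex &index, int role = Qt::DisplayRole) const override {
        if (!index.isValid() || role != Qt::DisplayRole)
            return QVariant();
        return fetchRow(index.row(), index.column());
    }

    // The view calls these as the user scrolls; the model grows in batches.
    bool canFetchMore(const QModelIndex &) const override { return m_loaded < m_total; }
    void fetchMore(const QModelIndex &) override {
        const int batch = qMin(256, m_total - m_loaded);
        beginInsertRows(QModelIndex(), m_loaded, m_loaded + batch - 1);
        m_loaded += batch;
        endInsertRows();
    }

private:
    QVariant fetchRow(int /*row*/, int /*column*/) const {
        return QVariant(); // placeholder: read one value from the .dbf on demand
    }
    int m_total;
    int m_columns;
    int m_loaded = 0;
};
```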
#5 Updated by Neil Robinson over 12 years ago
- Resolution deleted (fixed)
- Status changed from Closed to Feedback
Opening a large attribute table still fails for me. In fact, if I leave it long enough without killing the QGIS process, it hangs my whole system.
Using 1.0.2 on Ubuntu 9.04, installed from the PPA repositories.
Is there some way to get debugging info that I could send through to help with troubleshooting this?