Bug report #9431
Crash When Using Relationships with Large PostGIS Tables
|Affected QGIS version:|master|Regression?:|No|
|Operating System:| |Easy fix?:|No|
|Pull Request or Patch supplied:|No|Resolution:| |
|Crashes QGIS or corrupts data:|Yes|Copied to github as #:|18025|
Does this feature support large relationship tables from a database? I tried to build a relationship between PostgreSQL tables where the related table has about 2 million rows (and the foreign key column is indexed). However, when I tried to run a feature info query on the map canvas, QGIS hung for 20 minutes and then crashed. Does it try to fetch and cache all of the relationship data upfront, rather than doing a database index lookup each time?
Note: the relationship fields are defined as int4 in PostgreSQL.
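For context, the two access patterns at issue can be sketched in plain Python. This is only an illustration of the performance difference the reporter describes (a full client-side scan versus an indexed lookup); the row data and field names are hypothetical and nothing here is QGIS code.

```python
# Hedged sketch: why scanning ~2 million related rows on the client is slow
# compared with an indexed lookup. A dict stands in for the database index
# on the foreign key column; names and data are hypothetical.

def client_side_filter(rows, fk_value):
    """QGIS-style approach: fetch every row, then filter locally (O(n) per lookup)."""
    return [r for r in rows if r["fk"] == fk_value]

def build_index(rows):
    """One-time index build, analogous to the PostgreSQL index on the FK column."""
    index = {}
    for r in rows:
        index.setdefault(r["fk"], []).append(r)
    return index

def indexed_lookup(index, fk_value):
    """Database-style approach: near-constant-time lookup via the index."""
    return index.get(fk_value, [])

# Small stand-in dataset: 100,000 rows, 1,000 distinct FK values.
rows = [{"id": i, "fk": i % 1000} for i in range(100_000)]
index = build_index(rows)
assert client_side_filter(rows, 42) == indexed_lookup(index, 42)
```

Both approaches return the same rows; the difference is that the scan repeats O(n) work on every identify click, which matches the multi-minute hang reported above.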
#1 Updated by Andreas Neumann over 7 years ago
Jeremy, I am not the developer of this feature, but yes: currently the tables are fetched to the client in full and the relationship is built on the client. This is necessary because QGIS also supports file-based data sources.
However, Matthias, the author of this feature, plans to write a QGIS-expression-to-database-expression compiler that should be able to do the filtering in the database. You will have to be patient and wait a bit, though; it won't be released with QGIS 2.2.
#3 Updated by Matthias Kuhn over 7 years ago
- Status changed from Open to Feedback
- Affected QGIS version changed from 2.0.1 to master
It would be very helpful if you could attach a stacktrace of your crash and check how the memory consumption behaves before the crash, so we can try to figure out where exactly the problem lies.
It indeed does not (yet) make use of indexes; instead it iterates over all the features and filters them locally. I plan to integrate such functionality in future releases and will be looking for funding for this (probably starting around May). If you are considering helping to fund such a development, please contact me via email.
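The compiler idea mentioned here can be sketched minimally: translate a simple QGIS-style filter expression into a parameterized SQL WHERE fragment so the database does the filtering. This is a hedged illustration only; the actual QGIS expression compiler is far more general, and the one-pattern grammar below is a hypothetical subset.

```python
# Hedged sketch of an expression-to-SQL compiler: handle the single pattern
# '"field" = integer' and fall back to client-side filtering for anything else.
# This is illustrative, not the QGIS implementation.

import re

def compile_to_where(expression):
    """Compile '"field" = value' into a parameterized SQL WHERE fragment."""
    m = re.fullmatch(r'\s*"(\w+)"\s*=\s*(\d+)\s*', expression)
    if m is None:
        return None  # expression too complex: filter on the client instead
    field, value = m.groups()
    return f'WHERE "{field}" = %s', (int(value),)

sql, params = compile_to_where('"fk_id" = 42')
# The caller would then run something like:
# cursor.execute('SELECT * FROM related_table ' + sql, params)
```

With such a translation in place, an identify click would issue one indexed query per parent feature instead of streaming the whole related table to the client.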
PS: I assume this affects master (the feature was not included in 2.0.1).