Feature request #7369
Size limit when using postgis data source
| Status: | Open |
|---|---|
| Priority: | Normal |
| Assignee: | Victor Olaya |
| Category: | Processing/SAGA |
| Pull Request or Patch supplied: | No |
| Resolution: | |
| Easy fix?: | No |
| Copied to github as #: | 16342 |
Description
The PostGIS layer I want to run IDW on is 3.6 GB. Sextante tries to write it out to a temporary shapefile before passing it to SAGA, which can never work here because the .dbf table would exceed the 2 GB limit. I let it run for 8 hours and it never seemed to finish even writing the .shp. This may be a SAGA limitation, since it requires shapefile input, but we need a way to pass large data to analysis algorithms, either via a memory provider or by using PostGIS directly. I have 32 GB of RAM and am on 64-bit Ubuntu.
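As a rough illustration of why the export fails, a .dbf file's size can be estimated from the dBASE layout (a 32-byte header, 32 bytes per field descriptor, and one deletion-flag byte plus the fixed field widths per record). The field widths below are illustrative assumptions, not taken from the reporter's actual data; this is a sketch, not QGIS or SAGA code.

```python
DBF_LIMIT = 2 * 1024 ** 3  # the ~2 GB ceiling commonly enforced for .dbf files

def estimated_dbf_size(feature_count, field_widths):
    """Estimate the .dbf size in bytes for a layer with the given attribute widths."""
    header = 32 + 32 * len(field_widths) + 1   # file header + field descriptors + terminator
    record = 1 + sum(field_widths)             # deletion flag + fixed-width field values
    return header + feature_count * record

# e.g. 40 million features with a few typical attribute columns
size = estimated_dbf_size(40_000_000, [10, 19, 19, 254])
print(size > DBF_LIMIT)  # the export cannot fit in a shapefile's .dbf
```

Even with modest attribute widths, tens of millions of records blow past the limit long before the geometry (.shp) side becomes a problem.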
Related issues
History
#1 Updated by Victor Olaya over 11 years ago
- Status changed from Open to Feedback
I am afraid there is nothing we can do about that. SAGA only supports those formats, and it is a limitation of that software. The only solution is to develop new things in SAGA...
#2 Updated by Alex Mandel over 11 years ago
Victor Olaya wrote:
I am afraid there is nothing we can do about that. SAGA only supports those formats, and it is a limitation of that software. The only solution is to develop new things in SAGA...
What about warning users that an operation may not finish, or that it failed because the layer is too big for a shapefile? Mine just sat locked up all night.
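The pre-flight warning suggested here could be as simple as comparing an estimated export size against the .dbf ceiling before the algorithm starts. This is a hypothetical sketch of such a guard, not an existing QGIS/Processing API; the function name and message wording are assumptions.

```python
DBF_LIMIT = 2 * 1024 ** 3  # ~2 GB .dbf ceiling

def shapefile_export_warning(estimated_bytes, limit=DBF_LIMIT):
    """Return a user-facing warning string, or None if the export looks safe."""
    if estimated_bytes > limit:
        gb = estimated_bytes / 1024 ** 3
        return (f"Layer is too large to export to shapefile "
                f"({gb:.1f} GB exceeds the 2 GB .dbf limit); "
                f"the algorithm may hang or fail.")
    return None

# A 3.6 GB layer, as in the original report, would trigger the warning
print(shapefile_export_warning(int(3.6 * 1024 ** 3)))
```

Surfacing this message before launching the SAGA step would have saved the reporter an overnight lock-up.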
#3 Updated by Giovanni Manghi over 10 years ago
- Status changed from Feedback to Open
#4 Updated by Giovanni Manghi about 10 years ago
- Project changed from 78 to QGIS Application
- Category deleted (56)
- Crashes QGIS or corrupts data set to No
- Affected QGIS version set to 2.4.0
#5 Updated by Giovanni Manghi about 10 years ago
- Category set to Processing/SAGA
#6 Updated by Giovanni Manghi over 9 years ago
- Tracker changed from Bug report to Feature request
#7 Updated by Giovanni Manghi over 7 years ago
- Easy fix? set to No