Feature request #7369
Size limit when using postgis data source
Pull Request or Patch supplied: No
Resolution:
Easy fix?: No
Copied to github as #: 16342
The PostGIS layer I want to run IDW on is 3.6 GB. SEXTANTE tries to write it out to a temporary shapefile before passing it to SAGA, which can never work because the DBF table would exceed the 2 GB shapefile limit. I let it run for 8 hours and it never finished writing the shapefile. This may be a SAGA limitation (it requires shapefile input), but we need a way to pass large datasets to analysis algorithms, either through a memory provider or by reading from PostGIS directly. I have 32 GB of RAM on 64-bit Ubuntu.
#2 Updated by Alex Mandel about 7 years ago
Victor Olaya wrote:
I am afraid there is nothing we can do about that. SAGA supports those formats, and it is a limitation of that software. The only solution is to develop new input handling in SAGA itself.
What about warning users that an operation may not finish, or that it failed because the layer is too big for a shapefile? Mine just sat locked up all night.
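The warning suggested above could be implemented cheaply: the size of a DBF attribute table is predictable from the record count and field widths (32-byte file header, 32 bytes per field descriptor, one terminator byte, then one deletion-flag byte plus the field bytes per record, per the dBASE III layout). A minimal sketch follows; the function names and the example field widths are illustrative assumptions, not QGIS/SEXTANTE API:

```python
# Hedged sketch: estimate the DBF size a shapefile export would produce,
# so the framework could warn before attempting a doomed export.
# Constants follow the dBASE III file layout; names here are hypothetical.

DBF_LIMIT = 2 * 1024 ** 3  # 2 GB hard cap on DBF attribute tables

def estimated_dbf_size(num_records, field_widths):
    """Approximate on-disk size of a DBF table in bytes."""
    # 32-byte file header + 32 bytes per field descriptor + terminator byte
    header = 32 + 32 * len(field_widths) + 1
    # each record: 1 deletion-flag byte + the declared field widths
    record = 1 + sum(field_widths)
    return header + num_records * record

def check_export(num_records, field_widths):
    """Raise before export if the DBF would exceed the 2 GB limit."""
    size = estimated_dbf_size(num_records, field_widths)
    if size > DBF_LIMIT:
        raise ValueError(
            f"Layer would need a {size / 1024 ** 3:.1f} GB DBF; "
            "shapefile attribute tables are capped at 2 GB."
        )
    return size

# Example: 40 million records with a few wide text fields blows the limit,
# so check_export(40_000_000, [10, 80, 80]) raises instead of hanging.
```

Running this check before writing the temp file would turn an all-night hang into an immediate, explainable failure.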