Commit 8f1021c
Committed Nov 19, 2017
Disable WMS server test
1 parent af6b4cc

File tree: 1 file changed (+1 addition, −0 deletions)

.ci/travis/linux/blacklist.txt (1 addition & 0 deletions)

@@ -29,6 +29,7 @@ PyQgsSpatialiteProvider
 
 # Flaky, see https://travis-ci.org/qgis/QGIS/jobs/297708174
 PyQgsServerAccessControl
+PyQgsServerWMS
 
 # Need a local postgres installation
 PyQgsAuthManagerPKIPostgresTest
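The blacklist file above is plain text: one test name per line, with `#` comment lines. This commit does not show how the CI scripts consume it, but a hedged sketch of turning such a file into a test-exclusion pattern (for example for `ctest -E`) might look like this; the file path and wiring here are assumptions, not the actual QGIS CI scripts:

```shell
# Hypothetical sketch: build a ctest exclusion regex from a blacklist file.
# (The actual QGIS CI wiring is not reproduced in this commit.)
cat > /tmp/blacklist.txt <<'EOF'
# Flaky, see https://travis-ci.org/qgis/QGIS/jobs/297708174
PyQgsServerAccessControl
PyQgsServerWMS

# Need a local postgres installation
PyQgsAuthManagerPKIPostgresTest
EOF

# Drop comment lines and blank lines, then join the test names with '|'.
PATTERN=$(grep -v '^#' /tmp/blacklist.txt | grep -v '^$' | paste -sd'|' -)
echo "$PATTERN"
# -> PyQgsServerAccessControl|PyQgsServerWMS|PyQgsAuthManagerPKIPostgresTest
```

The resulting pattern could then be passed as `ctest -E "$PATTERN"` to skip every blacklisted test in one run.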

4 commit comments

@m-kuhn (Member, Author) commented on Nov 19, 2017

I had to disable this test because it has become really unstable recently. I'm sorry, but I'm not able to fix it myself, and it's currently giving many false alarms.

Can someone take over to re-enable (parts of) it?

It looks like it has been failing repeatedly since #5664 (see https://travis-ci.org/qgis/QGIS/builds), but since the PR itself was green and the test was already failing quite often before, I'm not convinced there's a connection between the PR and the test results.

Thanks a lot

@elpaso @pblottiere @rldhont

@pblottiere (Member) commented on Nov 20, 2017

Argh... I thought that Even's commit on the Spatialite provider (581d0d3) had stabilized these tests...

BTW, according to Travis, it seems that it's a real error this time (and not some flakiness):

======================================================================
FAIL: test_wms_GetProjectSettings_wms_print_layers (__main__.TestQgsServerWMS)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/root/QGIS/tests/src/python/test_qgsserver_wms.py", line 1964, in test_wms_GetProjectSettings_wms_print_layers
    self.assertTrue(xmlResult.find("<WMSBackgroundLayer>1</WMSBackgroundLayer>") != -1)
AssertionError: False is not true

I'll take a look and fix it. Moreover, I'm gonna split these WMS tests by request type (GetMap, GetPrint, and so on). This way, we'll be able to deactivate only some of the tests (and not all of them).
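Splitting by request type could look roughly like this: one `unittest.TestCase` class per WMS operation, so that a single flaky class can be blacklisted on its own. This is a minimal sketch with hypothetical class and test names, not the actual QGIS test code:

```python
import unittest


class TestQgsServerWMSGetMap(unittest.TestCase):
    """GetMap requests only; this class can be blacklisted independently."""

    def test_getmap_returns_image(self):
        # Stand-in for a real GetMap request/response round trip.
        content_type = "image/png"
        self.assertTrue(content_type.startswith("image/"))


class TestQgsServerWMSGetPrint(unittest.TestCase):
    """GetPrint requests only."""

    def test_getprint_returns_pdf(self):
        # Stand-in for a real GetPrint request/response round trip.
        content_type = "application/pdf"
        self.assertEqual(content_type, "application/pdf")
```

With separate classes, only the unstable one needs to be added to blacklist.txt while the rest of the WMS coverage keeps running.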

@elpaso (Contributor) commented on Nov 20, 2017

@pblottiere no: unfortunately the test was failing almost all the time. I had a look yesterday and found mainly tiny rendering differences; can we increase the acceptable difference threshold?

@pblottiere (Member) commented on Nov 20, 2017

no: unfortunately the test was failing almost all the time

:( Can I take a look at your Travis builds somewhere?

I found mainly tiny rendering differences; can we increase the acceptable difference threshold?

OK. I'm gonna do that at the same time.

We really have to tackle these issues; it's annoying for everybody. But so far I haven't succeeded in reproducing these issues locally...
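Raising the acceptable difference threshold amounts to tolerating a small number of mismatched pixels when comparing a rendered image against the reference. A minimal sketch of the idea, using a hypothetical helper rather than the actual QGIS test utilities:

```python
def images_match(pixels_a, pixels_b, max_mismatched_pixels=20):
    """Return True if two images differ in at most max_mismatched_pixels
    pixels. Images are given as equal-length sequences of (r, g, b)
    tuples; images of different sizes never match."""
    if len(pixels_a) != len(pixels_b):
        return False
    mismatched = sum(1 for a, b in zip(pixels_a, pixels_b) if a != b)
    return mismatched <= max_mismatched_pixels


# Two renderings that differ in a single anti-aliased edge pixel:
base = [(255, 255, 255)] * 100
rendered = list(base)
rendered[42] = (254, 255, 255)  # tiny rendering difference

print(images_match(base, rendered))      # True: within the threshold
print(images_match(base, rendered, 0))   # False: strict comparison fails
```

The trade-off is the usual one: a threshold of zero makes the test flaky on minor rendering differences across platforms, while too large a threshold can hide real regressions, so the value has to be tuned per test.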
