
add travis-ci, run tests with phantomjs #303

Open · wants to merge 5 commits into master
Conversation

ericblade
Collaborator

needs some additional work and discussion before merging.

singleRun: false prevents the tests from completing, as the first
set of tests never exits, so the second set of tests never begins.
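The commit message above describes why `singleRun: false` hangs CI. A minimal sketch of the relevant Karma settings, assuming a standard `karma.conf.js` layout (the actual frameworks and file list in this repo may differ):

```javascript
// karma.conf.js (sketch -- field names follow Karma's standard config schema)
module.exports = function (config) {
    config.set({
        frameworks: ['mocha', 'chai'],
        browsers: ['PhantomJS'],
        files: ['test/test-main-integration.js'],
        // singleRun: true makes Karma exit after one pass, which is what CI
        // needs; with singleRun: false the first run never exits, so a second
        // queued run never begins.
        singleRun: true,
        autoWatch: false
    });
};
```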
@ericblade
Collaborator Author

So, this gets the test process working non-interactively, but there are three errors in the existing test cases. I haven't yet had time to see whether the tests themselves are broken, or whether there's something up with the library.

First error:

'image-001.jpg'
PhantomJS 2.1.1 (Linux 0.0.0): Executed 1 of 11 SUCCESS (0 secs / 0.944 secs)
PhantomJS 2.1.1 (Linux 0.0.0) decodeSingle EAN-extended should decode base/test/fixtures/ean_extended/ correctly FAILED
	TypeError: undefined is not an object (evaluating 'result.codeResult.code') (test/test-main-integration.js:14676)
PhantomJS 2.1.1 (Linux 0.0.0): Executed 2 of 11 (1 FAILED) (0 secs / NaN secs)
'image-001.jpg'

Second error (looks to be the same as the first, but from a different test):

'image-007.jpg'
PhantomJS 2.1.1 (Linux 0.0.0): Executed 3 of 11 (1 FAILED) (0 secs / NaN secs)
PhantomJS 2.1.1 (Linux 0.0.0) decodeSingle Code39 should decode base/test/fixtures/code_39/ correctly FAILED
	TypeError: undefined is not an object (evaluating 'result.codeResult.code') (test/test-main-integration.js:14676)
PhantomJS 2.1.1 (Linux 0.0.0): Executed 4 of 11 (2 FAILED) (0 secs / NaN secs)
'image-001.jpg'

Third error, which looks like a decode error:

'image-005.jpg'
PhantomJS 2.1.1 (Linux 0.0.0): Executed 4 of 11 (2 FAILED) (0 secs / NaN secs)
PhantomJS 2.1.1 (Linux 0.0.0) decodeSingle EAN-8 should decode base/test/fixtures/ean_8/ correctly FAILED
	AssertionError: expected '74070749' to equal '90162602' (node_modules/chai/chai.js:210)
PhantomJS 2.1.1 (Linux 0.0.0): Executed 5 of 11 (3 FAILED) (0 secs / NaN secs)
'image-001.jpg'

@ericblade
Collaborator Author

OK, so, for ean_extended/image-001.jpg and code_39/image-007.jpg, the callback is being called without a codeResult. For ean_8/image-005.jpg, it's returning the wrong code.
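The `TypeError` in the logs comes from dereferencing `result.codeResult.code` when the decode callback got no `codeResult` at all. A small sketch of the failure mode and a hypothetical guard (the `extractCode` helper is illustrative, not the repo's actual test code):

```javascript
// Quagga-style decodeSingle callbacks can be invoked with a result that has
// no codeResult (no barcode found), so result.codeResult.code throws.
// extractCode is a hypothetical guard that returns null instead of throwing.
function extractCode(result) {
    if (!result || !result.codeResult) {
        return null; // decode failed -- let the assertion report it cleanly
    }
    return result.codeResult.code;
}

console.log(extractCode(undefined));                             // null
console.log(extractCode({}));                                    // null
console.log(extractCode({ codeResult: { code: '90162602' } }));  // '90162602'
```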

@ericblade
Collaborator Author

ericblade commented Jun 2, 2018

Sometimes the third error (ean_8/image-005.jpg) doesn't come up, but sometimes a different third one does:

LOG: 'image-001.jpg'
PhantomJS 2.1.1 (Windows 8.0.0) decodeSingle UPC-E should decode base/test/fixtures/upc_e/ correctly FAILED
        AssertionError: expected undefined to be an object (http:https://localhost:9877node_modules/chai/chai.js:210)

So, it seems we have some non-deterministic results coming from ean_8 and upc_e, and some total failures from ean_2_reader and code_39.

@ericblade
Collaborator Author

Consistently, on the Travis machine that was doing the testing, the tests that I commented out in 74d8673 failed

... but they did not fail on my local machine. It seems to me that there are definitely some strange things afoot in some of the readers, possibly?

@ericblade
Collaborator Author

I don't want to merge this until there's some input from others -- not just "looks good", but input on the actual functionality. I couldn't get the tests to run in Chrome at all; I don't know if I was failing at some configuration step, missing a step entirely, or what.

I don't know if there are any disadvantages to running in PhantomJS versus Chrome, so long as PhantomJS supports what's needed (which it seems to, after plugging in the Promises polyfill). I would think that, for now, it would be good to go with it.
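PhantomJS 2.1 has no native Promise, so the polyfill has to be served before the test bundle. A sketch of how that wiring might look in the Karma config -- the `es6-promise` package and file path here are one common choice, not necessarily what this PR actually uses:

```javascript
// karma.conf.js (sketch, assumed package name es6-promise)
module.exports = function (config) {
    config.set({
        browsers: ['PhantomJS'],
        files: [
            // polyfill must load before any test code that touches Promise
            'node_modules/es6-promise/dist/es6-promise.auto.js',
            'test/test-main-integration.js'
        ]
    });
};
```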

I'm not sure if there's an easy way to enable "watch" mode for testing -- I disabled it here so that the Travis builds would complete instead of hanging while watching for file updates. I'd definitely like to have the file watcher on for tests, though; it's super convenient for making quick changes in development.
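One way to keep watch mode locally while still letting Travis exit cleanly is to key the config off an environment variable. Travis sets `CI=true` automatically; the layout below is a sketch under that assumption, not this PR's actual config:

```javascript
// karma.conf.js (sketch): toggle watch mode based on the CI env var,
// which Travis sets to "true" on its build machines.
const ci = process.env.CI === 'true';

module.exports = function (config) {
    config.set({
        browsers: ['PhantomJS'],
        singleRun: ci,   // on Travis: exit after one pass
        autoWatch: !ci   // locally: keep the file watcher running
    });
};
```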

Basically, I'm sure this change is a step in the right direction for the time being, but we may have to go in a different direction with some of these changes, or add more to them -- such as not eliminating test watching. :-)
