Set up a testing infrastructure with different platforms #1572

Closed
6 tasks done
axic opened this issue Jan 17, 2017 · 29 comments
axic (Member) commented Jan 17, 2017

Solidity should be compiled on different platforms as part of the CI (already done) and the tests should be run on each of the platforms, comparing the resulting bytecode.

  • generate and store code for Linux
  • generate and store code for Emscripten
  • generate and store code for Windows
  • generate and store code for macOS
  • all of the above with and without the optimizer
  • a job that performs comparisons (it should run on each push and check the complete data set, but also daily to catch incomplete data)

--
It would probably make sense to set up a testing infrastructure that runs solc-js as well as solc on Linux and on Windows (each built with both gcc and clang) against a set of contracts, to ensure that solc always behaves the same (see the sketch below).

cc @holiman
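
A minimal sketch of what such a comparison job could look like, assuming the per-platform binaries have been collected into an artifacts/ directory (all paths, binary names, and the solc-js wrapper below are made up for illustration):

```python
#!/usr/bin/env python3
# Sketch of a comparison job: compile the same contract with several
# solc builds and fail if the bytecode differs. Binary paths and the
# contract path are hypothetical.
import subprocess
import sys

CONTRACT = "test/contract.sol"
COMPILERS = {
    "linux-gcc": ["artifacts/solc-linux-gcc", "--bin", CONTRACT],
    "linux-clang": ["artifacts/solc-linux-clang", "--bin", CONTRACT],
    # hypothetical wrapper that prints the Emscripten build's output to stdout
    "emscripten": ["node", "artifacts/solcjs-wrapper.js", "--bin", CONTRACT],
}

outputs = {}
for name, cmd in COMPILERS.items():
    result = subprocess.run(cmd, capture_output=True, text=True, check=True)
    outputs[name] = result.stdout

reference = outputs["linux-gcc"]
mismatches = [name for name, out in outputs.items() if out != reference]
if mismatches:
    print("bytecode differs from linux-gcc on:", ", ".join(mismatches))
    sys.exit(1)
print("all platforms produce identical bytecode")
```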

axic (Member, Author) commented Jan 17, 2017

Note: we already test linux-gcc, linux-clang and windows-msvc, but we don't compare those results against solc-js.

federicobond (Contributor) commented:

Does it make sense to include osx-clang too? It shouldn't be hard to set up with Travis.

axic (Member, Author) commented Jan 18, 2017

@federicobond macOS is already supported, but it was disabled because it was too slow on the CI (~40 minutes or so).

We could either:

  1. speed it up and re-enable it, or
  2. a) add Emscripten compilation and
     b) run the Solidity tests against solc-js.

Probably 2.b doesn't really belong here and should be a separate project, but 2.a at least could be very useful.

chriseth (Contributor) commented:

@axic I don't think there is a way in Travis to take the build artifacts from two steps and have another step that uses them. We could download and execute solc-js in the Ubuntu build and vice versa, but that would trigger errors whenever we change something in the code generator. So unless Travis supports some kind of complex build-artifact pipeline, the only way I see is something like setting up a separate repository with a separately triggered build process that downloads the artifacts from Travis...

chriseth (Contributor) commented:

OK, I just spent a little more thought on this. Here is how it could work: if all parallel Travis runs are successful, a "deploy" step runs. This deploy step copies the build artifacts to a special GitHub repository. That repository has a Travis setup that takes the binaries and tests them against each other. The only question I still have is whether the deploy step has access to all the binaries or only to the ones produced by its own build step, and, if it is the latter, how to synchronize uploading to the other GitHub repository. I guess there should be a simpler solution :-)
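
A rough sketch of what that deploy step might do, assuming a dedicated artifact repository (the repository name, artifact paths, and the BUILD_PLATFORM variable are invented for illustration; TRAVIS_COMMIT is a real Travis variable):

```python
#!/usr/bin/env python3
# Sketch of the deploy step: push this job's build artifact into a
# shared repository, one directory per commit and platform.
import os
import shutil
import subprocess

commit = os.environ["TRAVIS_COMMIT"]      # provided by Travis
platform = os.environ["BUILD_PLATFORM"]   # hypothetical, set per build job
repo = "git@github.com:ethereum/solidity-test-artifacts.git"  # hypothetical repo

subprocess.run(["git", "clone", "--depth", "1", repo, "artifact-repo"], check=True)
dest = os.path.join("artifact-repo", commit, platform)
os.makedirs(dest, exist_ok=True)
shutil.copy("build/solc/solc", dest)  # hypothetical artifact path

subprocess.run(["git", "-C", "artifact-repo", "add", "-A"], check=True)
subprocess.run(["git", "-C", "artifact-repo", "commit",
                "-m", "artifacts for %s (%s)" % (commit, platform)], check=True)
# Parallel jobs may race on this push; a real job would pull --rebase and retry,
# which is exactly the synchronization question raised above.
subprocess.run(["git", "-C", "artifact-repo", "push"], check=True)
```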

VoR0220 (Member) commented Jan 19, 2017

God... Travis is so complicated in all the wrong ways. I've been reading over the documentation and trying to figure out what to do to make this work "just" right... and yeah, if I'm being honest, the CI setup in general could probably use something of a makeover. BTW, do we still need to run the tests three times, or has that concern generally been eliminated at this point?

VoR0220 (Member) commented Jan 19, 2017

Chris, there's a simpler way that doesn't involve pushing to a separate repository. We just need to find a way to build soltest with Emscripten... so the tests would look like the following pseudo Travis script:

Build:

  • Windows (gcc/clang): use the Windows-specific script
  • OS X (gcc/clang): use the OS X-specific script
  • Linux (gcc/clang): use the Linux-specific script
  • Emscripten (gcc/clang): use the Emscripten-specific script
  • Docs: whatever the docs generator does
  • Docker: run the Docker build with soltest enabled

Run:

  • soltest (inside a script, obviously)

Deploy (assumes all of the above pass and that the target branch is either develop or release):

  • provider: script (run both the Emscripten deploy and the docker push)
  • provider: releases (GitHub): push tarballs, zips, and dmgs

Granted, this is all easier said than done... it requires quite a bit of refactoring, IMO.

VoR0220 (Member) commented Jan 19, 2017

Scratch the Travis bit for Windows... Travis doesn't support Windows... but you get the high-level idea of how this should work, IMO.

federicobond (Contributor) commented Jan 19, 2017 via email

VoR0220 (Member) commented Jan 19, 2017

Correct, my thoughts exactly. The good news is that the majority of the heavy lifting has already been done (dependency installation in particular).

VoR0220 self-assigned this Jan 19, 2017
VoR0220 (Member) commented Jan 19, 2017

I figure that since I'm doing the Docker deployment, I might as well make the entire thing better while I'm here. It shouldn't be too hard.

axic (Member, Author) commented Jan 19, 2017

Note: we already have AppVeyor run the tests :)

We are only lacking:

  • macOS tests (they are technically still in the tree, but disabled; no big hassle to write)
  • Emscripten tests (these need to be written)

In order to do the Emscripten tests, as @VoR0220 mentioned, we could compile the testing framework itself with Emscripten. That is probably one of the easiest ways; see the sketch below.
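
If soltest can be compiled with Emscripten, the CI step itself could stay tiny, e.g. something like this (the soltest.js output name is an assumption about what such an Emscripten build would emit):

```python
#!/usr/bin/env python3
# Sketch: run an Emscripten-compiled soltest under node and propagate
# its exit code to the CI job. The file name is an assumption.
import subprocess
import sys

result = subprocess.run(["node", "build/test/soltest.js"])
sys.exit(result.returncode)
```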

chriseth (Contributor) commented:

I would still like to compare the binaries produced on the different platforms. I'm also pretty sure that this is much faster, since most of the time in the tests is spent not on compiling but on executing the tests. It is enough to run the tests on a single platform and only compare the produced bytecode on the other platforms.

axic (Member, Author) commented Jan 19, 2017

> It is enough to run the tests on a single platform and only compare the produced bytecode on the other platforms.

Are we talking about adding new tests to produce bytecode or taking the bytecode from the current tests?

VoR0220 (Member) commented Jan 19, 2017

@chriseth I too am curious about whether you are talking about adding new tests to produce bytecode or taking the bytecode from current tests.

chriseth (Contributor) commented:

Either or.

VoR0220 (Member) commented Jan 20, 2017

I would prefer the latter. That's far easier to add, I think.

VoR0220 (Member) commented Jan 21, 2017

So I think I figured out why OS X was running so slowly. The main problem centers around brew and the fact that we were upgrading Travis' entire brew suite, which led to insanely long update times. I've fixed this, but it's still fairly slow due to the cpp-ethereum dependency and how long it takes to build.

I have an idea to make this run even faster for all builds. All we need is a properly pushed, stable cpp-ethereum Docker image; from there, this becomes a trivial matter to solve. It's also required for our Docker deployment. So, any chance we could get that up, @chriseth? After that, it becomes a docker pull, and we call the RPC of the cpp-ethereum image. Furthermore, it also enables us to test on multiple platforms and bypass Travis' limitations. So if someone has that, it would be very appreciated.
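
The docker pull plus RPC idea could look roughly like the sketch below (the image name and port mapping are assumptions; web3_clientVersion is a standard Ethereum JSON-RPC method):

```python
#!/usr/bin/env python3
# Sketch: start a cpp-ethereum container and wait until its JSON-RPC
# endpoint responds. Image name and port mapping are assumptions.
import json
import subprocess
import time
import urllib.request

subprocess.run(["docker", "pull", "ethereum/cpp-ethereum"], check=True)
subprocess.run(["docker", "run", "-d", "--name", "testnode",
                "-p", "8545:8545", "ethereum/cpp-ethereum"], check=True)

payload = json.dumps({"jsonrpc": "2.0", "method": "web3_clientVersion",
                      "params": [], "id": 1}).encode()
for _ in range(60):  # poll for up to roughly a minute
    try:
        req = urllib.request.Request("http://localhost:8545", data=payload,
                                     headers={"Content-Type": "application/json"})
        with urllib.request.urlopen(req, timeout=1) as response:
            print("node is up:", json.load(response)["result"])
            break
    except OSError:
        time.sleep(1)
else:
    raise RuntimeError("cpp-ethereum RPC did not come up in time")
```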

VoR0220 (Member) commented Jan 21, 2017

Never mind, it appears there is one already. I shall use it!

VoR0220 (Member) commented Jan 23, 2017

Just to clarify... we do in fact already test the bytecode and assert that it will be X in many of the end-to-end tests, correct? So... I'm confused, I guess. What needs to be added here beyond an Emscripten soltest that also passes all the tests? Could we not logically assume that the bytecode would then be the same?

VoR0220 (Member) commented Jan 23, 2017

Ahh... I reread it a couple of times and now I think it makes sense, at least for the end-to-end tests. This way we could segregate them from the rest of the normal tests and set a flag to either emit the generated bytecode or actually run the tests against the IPC endpoint. We can then deposit the generated bytecode into artifacts, upload them to an S3 instance, and compare all the artifacts from the different distros there.
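
A sketch of what that comparison over downloaded artifacts could look like; the artifacts/<platform>/<test>.bin layout is an assumption, not an existing convention:

```python
#!/usr/bin/env python3
# Sketch: compare bytecode artifacts collected from different platforms
# and also flag incomplete data (a test missing on some platform).
import hashlib
from pathlib import Path

root = Path("artifacts")  # hypothetical layout: artifacts/<platform>/<test>.bin
platforms = sorted(p.name for p in root.iterdir() if p.is_dir())

hashes = {}  # test file name -> {platform: sha256 of the bytecode}
for platform in platforms:
    for binfile in (root / platform).glob("*.bin"):
        digest = hashlib.sha256(binfile.read_bytes()).hexdigest()
        hashes.setdefault(binfile.name, {})[platform] = digest

for name, per_platform in sorted(hashes.items()):
    if len(set(per_platform.values())) > 1:
        print("MISMATCH", name, per_platform)
    missing = set(platforms) - set(per_platform)
    if missing:
        print("INCOMPLETE", name, "missing on", sorted(missing))
```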

VoR0220 closed this as completed Jan 23, 2017
VoR0220 reopened this Jan 23, 2017
VoR0220 (Member) commented Jan 23, 2017

That said, I think the rest of the tests that are not node-reliant (tests that don't rely on the cpp-ethereum client) definitely still need to be run. If that's the case, this might take a bit longer than I expected.

chriseth (Contributor) commented:

The EndToEnd tests generate bytecode, but the actual tests check the semantics of that bytecode. It might be challenging to find suitable identifiers for the generated bytecode.
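
One possible scheme (purely an assumption, not anything agreed on in this thread) would be to key each bytecode blob on the test name plus a short hash of the source and the compiler settings:

```python
# Sketch of one possible identifier scheme for dumped bytecode:
# test name plus a short hash of source and settings. Illustrative only.
import hashlib

def bytecode_id(test_name: str, source: str, optimize: bool) -> str:
    settings = "optimize=%s" % optimize
    digest = hashlib.sha256((source + "\0" + settings).encode()).hexdigest()[:16]
    return "%s-%s.bin" % (test_name.replace("/", "_"), digest)

# Example: bytecode_id("SolidityEndToEndTest/smoke_test", src, True)
```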

axic (Member, Author) commented Feb 3, 2017

Maybe it would also make sense to run fuzzing (#1125) on these platforms?

axic (Member, Author) commented Feb 7, 2017

Sorting out ethereum/interfaces#4 would also allow running the tests against cpp-ethereum and go-ethereum (and possibly Parity).

chriseth added the soon label Mar 8, 2017
chriseth self-assigned this and unassigned VoR0220 Mar 15, 2017
chriseth (Contributor) commented:

Working on this now.

chriseth added the in progress label and removed the soon label Mar 15, 2017
chriseth (Contributor) commented May 3, 2017

The comparison is broken: it does not compare the latest commit. Also, the repository keeps growing and growing; we should clean (and rebase) it from time to time, or use branches that get deleted after some time.

chriseth (Contributor) commented Oct 9, 2017

This is more or less done. Closing.

chriseth closed this as completed Oct 9, 2017