Hi! Rust Analyzer failed domain resolution in my solution.
It said it failed to download https://index.crates.io/config.json due to a domain resolution failure for index.crates.io, but the domain resolves fine from my end.
This led to a failure to recognize an extra dependency that the author of the exercise (“Gigasecond”) put in place.
Thanks a lot for the report. I recently implemented clippy integration, which is failing in this case. The analyzer doesn’t have access to dependencies like the test runner does… So I guess the quickest way to solve the issue would be to disable clippy on submissions with dependencies. Optionally, we could add dependency support just like in the test runner later. But there are some challenges with that, e.g. ensuring the test runner and analyzer always work with the same set of supported dependencies…
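To illustrate the stop-gap, here is a minimal sketch of what such a guard could look like. The function name and the line-based manifest scan are hypothetical, not the analyzer's actual code; a real implementation would likely parse the manifest with a proper TOML parser.

```rust
/// Hypothetical guard: returns true if the student's Cargo.toml declares
/// any dependencies, in which case the analyzer would skip running clippy.
fn has_dependencies(manifest: &str) -> bool {
    let mut in_deps = false;
    for line in manifest.lines() {
        let line = line.trim();
        if line.starts_with('[') {
            // Entering a new section; check whether it is a dependency table.
            in_deps = line == "[dependencies]"
                || line == "[dev-dependencies]"
                || line == "[build-dependencies]";
        } else if in_deps && !line.is_empty() && !line.starts_with('#') {
            // A non-empty, non-comment line inside a dependency table.
            return true;
        }
    }
    false
}

fn main() {
    let with_dep = "[package]\nname = \"gigasecond\"\n\n[dependencies]\ntime = \"0.3\"\n";
    let without_dep = "[package]\nname = \"gigasecond\"\n";
    assert!(has_dependencies(with_dep));
    assert!(!has_dependencies(without_dep));
}
```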
PR to disable clippy if dependencies are used
@ErikSchierboom What do you think, is it a good idea to use cargo-local-registry in the analyzer as well? There would be a slight increase in image size, but I’m more concerned about how to keep the test runner and analyzer in sync.
Possible approaches that come to mind:
- Copy-paste all the cargo-local-registry related setup, and hope we never forget to mirror changes from one repo to the other.
- Create a new repo that builds a docker image with just the cargo-local-registry setup, which both the test runner and the analyzer can then build upon in their Dockerfiles.
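As a sketch of the second option (the base image tag and all paths are made up, not actual Exercism infrastructure; note that cargo-local-registry syncs from a lock file):

```dockerfile
# Hypothetical shared base image carrying the cargo-local-registry setup.
# Both the test runner and the analyzer Dockerfiles would start FROM it.
FROM rust:1.75-slim
RUN cargo install cargo-local-registry
# The lock file pins the exact dependency versions to vendor.
COPY Cargo.lock /registry-src/Cargo.lock
RUN cargo local-registry --sync /registry-src/Cargo.lock /opt/registry
```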
I don’t think it’s a terrible loss if clippy just doesn’t run on exercises with dependencies. But it would be nice…
My 2 cents:
1. When there’s a dependency in the exercise, it’s a very concrete package and version. Could the toolchain be tricked into using a locally stored copy of those specific packages? They would only change when someone changes the exercise.
2. If you’d allow any package as a dependency, which I understand is more complex and less secure, it would expand the space of possible solutions.

That being said, if (1) is achievable, it would cover the most common case of someone following the instructions ;)
This is exactly what we do with the test runner. Here’s the list of exact dependencies that are stored within the test runner image so they can be used without fetching from crates.io over the network.
We just haven’t done that for the analyzer. And whether or not it’s worth it is an open question.
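For context, cargo itself supports redirecting crates.io lookups to a pre-populated local registry via source replacement in `.cargo/config.toml`; a sketch, with an illustrative registry path:

```toml
# Replace the crates.io source with a registry baked into the image,
# so builds never reach out over the network.
[source.crates-io]
replace-with = "local-registry"

[source.local-registry]
local-registry = "/opt/registry"
```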
Btw, the fix is deployed; it should be working now.
I would go with this one. To address the risk of forgetting to update both places, I think a comment in both .toml files saying that if you update one you must also update the other is enough.
In the Go track, we also create a fake module just so the Go toolchain can download the packages to build into the test runner. In our case, we must make sure the versions that fake module uses match the versions of the modules students are given. That comment has been enough, and has already saved me from forgetting to keep things in sync at least twice.
Wouldn’t it be possible to download the required files from the test runner in the analyzer’s CI build action? That way you only have to rebuild the analyzer when the test runner dependencies change.
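Something like the following CI step could do that. The repo name and file paths here are purely illustrative, not the actual test runner layout:

```yaml
# Hypothetical GitHub Actions step in the analyzer repo: fetch the
# dependency manifest straight from the test runner repo so the two
# stay in sync without manual copy-pasting.
- name: Sync dependency list from test runner
  run: |
    curl -fsSL \
      https://raw.githubusercontent.com/exercism/rust-test-runner/main/local-registry/Cargo.toml \
      -o local-registry/Cargo.toml
```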
I’m fine with either! The Nim tools also have their own base image: GitHub - exercism/nim-docker-base: A base docker image for Exercism's nim tooling
The Rust analyzer is now working for submissions with dependencies as well. (PR for reference)