While working on https://github.com/ooni/probe/issues/2130, and specifically on the action item about making sure all workflows are green, I was confronted with the complexity of the QA directory. There is plenty of cleaning up and simplifying to do there. The original intent was to A/B test `miniooni` and `measurement_kit` to ensure they behaved the same, but we no longer have this need. Rather, the QA scripts have grown large and flaky, to the point that I am always tempted to ignore them. The underlying censorship engine, Jafar, has also not been developed for quite some time.

So, the first step towards improving the QA infrastructure is to be humble and acknowledge that we cannot realistically maintain these checks, with Jafar as a backend, for so many experiments. Let us focus on our most important experiment, Web Connectivity, and keep QA checks for it. Additionally, let us simplify and clean up QA as much as possible, though without introducing radical changes. The end result is a QA for Web Connectivity that seems reasonable and runs in six minutes.
Quality Assurance scripts
This directory contains quality assurance scripts that use Jafar to ensure that OONI implementations behave as expected. These scripts work with `miniooni`.
Run QA using a Docker container
Run the QA for a given nettest in a suitable Docker container using:

```bash
./QA/rundocker.sh $nettest
```

Note that this will run a `--privileged` Docker container.
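For example, assuming the Web Connectivity QA is registered under the name `webconnectivity` (the name is illustrative; check the QA directory for the exact nettest names):

```bash
./QA/rundocker.sh webconnectivity
```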
Diagnosing issues
The Python script that performs the QA runs a specific OONI test under different failure conditions and stops at the first unexpected value found in the resulting JSONL report. You can infer what went wrong by reading the output of the `miniooni` command itself, which should be above the point where the Python script stopped, as well as by inspecting the JSONL file on disk. By convention, this file is named `$nettest.jsonl` and only contains the result of the last run of `$nettest`.
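For instance, here is a minimal sketch of how one might inspect the last measurement in such a file, assuming the standard OONI measurement layout with top-level `test_name`, `input`, and `test_keys` fields (the script itself is hypothetical, not part of the QA directory):

```python
# inspect_last.py: print the last measurement in a QA JSONL file.
# Assumes each line is one JSON-encoded OONI measurement.
import json
import sys


def last_measurement(path):
    """Return the last JSON object in a JSONL file, or None if empty."""
    last = None
    with open(path) as fp:
        for line in fp:
            line = line.strip()
            if line:
                last = json.loads(line)
    return last


if __name__ == "__main__":
    path = sys.argv[1] if len(sys.argv) > 1 else "web_connectivity.jsonl"
    measurement = last_measurement(path)
    if measurement is None:
        sys.exit(f"no measurements found in {path}")
    # Print the fields most useful for diagnosing a failed QA check.
    print(measurement.get("test_name"), measurement.get("input"))
    print(json.dumps(measurement.get("test_keys", {}), indent=2))
```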