Additional testing prerequisites

Additional testing prerequisites

Stefan Seefeld-2
Hello,

I recently added support for NumPy to the Boost.Python module. To
compile (and test) that, the development environment needs to include
Python's NumPy package. As I don't see the Boost.Python NumPy tests on
http://www.boost.org/development/tests/develop/developer/python.html, I
suspect none of the test machines have that Python NumPy package installed.

What is the right way to update these?


Thanks,

        Stefan


--

      ...ich hab' noch einen Koffer in Berlin...

_______________________________________________
Boost-Testing mailing list
[hidden email]
http://lists.boost.org/mailman/listinfo.cgi/boost-testing

Re: Additional testing prerequisites

Tom Kent


On Wed, Oct 19, 2016 at 9:25 AM, Stefan Seefeld <[hidden email]> wrote:
Hello,

I recently added support for NumPy to the Boost.Python module. To
compile (and test) that, the development environment needs to include
Python's NumPy package. As I don't see the Boost.Python NumPy tests on
http://www.boost.org/development/tests/develop/developer/python.html, I
suspect none of the test machines have that Python NumPy package installed.

What is the right way to update these ?

I just ran "pip install numpy" on my teeks99-09 machine; let's see if those runners start hitting it.

In general, I think we seriously need to update the "Running Regression Tests" page (http://www.boost.org/development/running_regression_tests.html) with lots more details on how to get a runner up and running. Nowhere on that page does it mention that Python needs to be added to the user-config.jam file in order to complete these tests. If I'm not mistaken, there are other external dependencies that are needed for effective Boost testing (zlib, bz2 for iostreams...others?).

Specifically for Python: since the library supports Python 2 and 3, should both of those be installed? How do we configure user-config.jam to use both versions, and how do we make sure that the test run hits both versions? How about 32-bit vs. 64-bit Python? If I install only 32-bit Python for the test runner but do a build with address-model=64, I don't think that will allow testing the Python library, correct?
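For reference, here is a minimal sketch of what registering two Python versions in user-config.jam might look like. The version numbers and interpreter paths are placeholders; adjust them to whatever is actually installed on the test machine:

```jam
# Hypothetical user-config.jam fragment: register two Python versions
# so Boost.Build can target either. Paths below are placeholders.
using python : 2.7 : /usr/bin/python2.7 ;
using python : 3.5 : /usr/bin/python3.5 ;
```

With both registered, I believe a run can request a specific version via the `python=` feature on the b2/bjam command line (e.g. `python=2.7,3.5` to build against both), though someone from Boost.Build should confirm the exact invocation.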

Thoughts?

Tom


Re: Additional testing prerequisites

Stefan Seefeld-2
On 19.10.2016 18:11, Tom Kent wrote:

>
>
> On Wed, Oct 19, 2016 at 9:25 AM, Stefan Seefeld <[hidden email]
> <mailto:[hidden email]>> wrote:
>
>     Hello,
>
>     I recently added support for NumPy to the Boost.Python module. To
>     compile (and test) that, the development environment needs to include
>     Python's NumPy package. As I don't see the Boost.Python NumPy tests on
>     http://www.boost.org/development/tests/develop/developer/python.html
>     <http://www.boost.org/development/tests/develop/developer/python.html>,
>     I
>     suspect none of the test machines have that Python NumPy package
>     installed.
>
>     What is the right way to update these ?
>
>
> I just ran "pip install numpy" on my teeks99-09 machine, lets see if
> those runners start hitting it.

Thanks, I'll keep an eye on the test URL...

> In general, I think we seriously need to update the "Running
> Regression Tests" page
> (http://www.boost.org/development/running_regression_tests.html) with
> lots more details on how to get a runner up and going. Nowhere on that
> page does it mention python needs to be added to the user-config.jam
> file in order to complete these tests. If I'm not mistaken there are
> other external dependencies that are needed for effective boost
> testing (zlib, bz2 for iostreams...others?).

Yeah. An MPI implementation for Boost.MPI comes to mind, too...

> Specifically for python, since the library supports python 2 and 3
> should both of those be installed? How do we configure user-config.jam
> to use both versions and how do we make sure that the test run hits
> both versions? How about python 32 bit vs. 64 bit? If I just install
> 32-bit python to use as the test runner, but I do a build with
> address-model=64, I don't think that will allow for testing the python
> library, correct?

All good questions. I'm cross-posting my reply to the Boost.Build list,
as I figure people there might have some of the answers (notably how to
configure the build environment).

I fully agree about the need for a formal document describing the setup
of a test machine. In fact, I wonder whether it wouldn't be useful to
set up a few containers with various platforms (OSes, compilers, etc.),
which contributors could then download to run tests on. That would be
very convenient for contributors.

On a related note, the
http://www.boost.org/development/tests/develop/developer/python.html
test matrix displays a disturbing number of failing test runs (runs
where almost all tests fail, suggesting a setup problem, rather than a
problem with individual tests), and I as the Boost.Python maintainer
find myself unable to even try to reproduce or fix those.
For now I have set up my own testing on travis-ci (where I only build
and test Boost.Python using SCons, instead of Boost.Build), but
ultimately I would like to be able to understand all the above failures.
Ideally one could figure out a single setup issue and thus flag an
entire test run as invalid, improving the signal-to-noise ratio of the
tests. I believe all this would be vastly helped by using pre-defined
containers...

>
> Thoughts?
>
> Tom
        Stefan

--

      ...ich hab' noch einen Koffer in Berlin...


Re: [Boost-build] Additional testing prerequisites

Stefan Seefeld-2
On 20.10.2016 19:27, Niklas Angare wrote:

> "Stefan Seefeld" wrote:
>> On a related note, the
>> http://www.boost.org/development/tests/develop/developer/python.html
>> test matrix displays a disturbing number of failing test runs (runs
>> where almost all tests fail, suggesting a setup problem, rather than a
>> problem with individual tests), and I as the Boost.Python maintainer
>> find myself unable to even try to reproduce or fix those.
>> For now I have set up my own testing on travis-ci (where I only build
>> and test Boost.Python using SCons, instead of Boost.Build), but
>> ultimately I would like to be able to understand all the above failures.
>> Ideally one could figure out a single setup issue and thus flag an
>> entire test run as invalid, improving the signal-to-noise ratio of the
>> tests. I believe all this would be vastly helped using pre-defined
>> containers...
>
> For my test runner NA-QNX650-SP1-x86 which has Python 2.5.2, at least
> some of the failures seem to be caused by the test code trying to use
> newer Python features. The documentation for Boost.Python claims to
> require only Python 2.2. Did the author of those tests forget to
> maintain compatibility, or are those tests only relevant to newer
> versions? If it's the latter, perhaps those tests shouldn't even be
> run when the Python version is too old.

Can you point me to the specific tests / test failures? I may have a
look. (Ideally please follow up with Boost.Python issues in
https://github.com/boostorg/python/issues)

>
> My other runner NA-QNX650-SP1-ARM is cross compiling and the target
> environment doesn't have Python so testing Boost.Python is not
> desirable. Should I disable it with --bjam-options="--without-python"?

I think so (though I'd expect Boost.Build to do that automatically if
Python can't be detected at configuration time).

>
> If you want more information about the configuration of the test
> runners, you could add a test that simply outputs diagnostic
> information unconditionally. For example config_test from Boost.System
> or config_info from Boost.Config do this.
>
> Regards,
>
> Niklas Angare

Thanks,
        Stefan


--

      ...ich hab' noch einen Koffer in Berlin...


Re: Additional testing prerequisites

Stefan Seefeld-2
In reply to this post by Tom Kent
On 19.10.2016 18:11, Tom Kent wrote:

>
>
> On Wed, Oct 19, 2016 at 9:25 AM, Stefan Seefeld <[hidden email]
> <mailto:[hidden email]>> wrote:
>
>     Hello,
>
>     I recently added support for NumPy to the Boost.Python module. To
>     compile (and test) that, the development environment needs to include
>     Python's NumPy package. As I don't see the Boost.Python NumPy tests on
>     http://www.boost.org/development/tests/develop/developer/python.html
>     <http://www.boost.org/development/tests/develop/developer/python.html>,
>     I
>     suspect none of the test machines have that Python NumPy package
>     installed.
>
>     What is the right way to update these ?
>
>
> I just ran "pip install numpy" on my teeks99-09 machine, lets see if
> those runners start hitting it.

Unfortunately I still don't see any NumPy tests in any of the test runs,
and there are no test logs (that I'm aware of) to look at to see what's
going on.
Could you please also add NumPy to the other machines (notably the ones
running Linux)?

While I'm of course able to test Boost.Python locally (with the desired
effects), I haven't yet been able to do a full test run:
Running the `run.py` script fails for me as early as checkout
(https://github.com/boostorg/regression/issues/39), so I'm blocked there.

I really hope we can resolve this soon, so all this work can be merged
into master in time for the 1.63 release!

Thanks,
        Stefan

--

      ...ich hab' noch einen Koffer in Berlin...


Re: Additional testing prerequisites

Niklas Angare
In reply to this post by Stefan Seefeld-2
"Stefan Seefeld" <[hidden email]> wrote:
> On 20.10.2016 19:27, Niklas Angare wrote:
>> For my test runner NA-QNX650-SP1-x86 which has Python 2.5.2, at least
>> some of the failures seem to be caused by the test code trying to use
>> newer Python features. The documentation for Boost.Python claims to
>> require only Python 2.2.
...
> Can you point me to the specific tests / test failures; I may have a
> look. (Ideally please follow up with Boost.Python issues in
> https://github.com/boostorg/python/issues)

I'm just running tests because I want Boost to be usable on QNX.
Boost.Python is not a library I foresee myself using so I'm not inclined to
spend a lot of time on it. This suspected issue of Python version
compatibility is probably not limited to QNX, but you may be able to learn
about it by looking at my test results.

Just go to the develop summary matrix for Python and click on the failures
in the "NA-QNX650-SP1-x86" column. I'll give you a taste:

test python - args:
  File "../libs/python/test/args.py", line 4
    from __future__ import print_function
SyntaxError: future feature print_function is not defined

test python - bienstman3:
<doctest __main__[1]>:3: Warning: 'as' will become a reserved keyword in
Python 2.6
File "../libs/python/test/bienstman3.py", line 7, in __main__

test python - builtin_converters:
  File "../libs/python/test/test_builtin_converters.py", line 5, in <module>
    if (sys.version_info.major >= 3):
AttributeError: 'tuple' object has no attribute 'major'
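For what it's worth, that last failure is easy to avoid on old interpreters: `sys.version_info` only gained named attributes such as `major` in Python 2.7/3.1, but it has always been usable as a tuple. A minimal sketch of the portable check:

```python
import sys

# sys.version_info gained named attributes (major, minor, ...) only in
# Python 2.7 and 3.1; plain tuple indexing works on every version.
is_py3 = sys.version_info[0] >= 3

print("Python 3+" if is_py3 else "Python 2")
```

Using `sys.version_info[0]` instead of `sys.version_info.major` in the test helpers would at least let old interpreters fail on the real feature under test rather than on the version check itself.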

>> My other runner NA-QNX650-SP1-ARM is cross compiling and the target
>> environment doesn't have Python so testing Boost.Python is not
>> desirable. Should I disable it with --bjam-options="--without-python"?
>
> I think so (though I'd expect boost.build to do that automatically if
> Python can't be detected at config time.)

For NA-QNX650-SP1-ARM, Python is present and working on the host running
Boost.Build, but not on the target. I'm not aware of any way to let
Boost.Build know this. I've added --without-python but it will take a day or
two before the tests are re-run.

Regards,

Niklas Angare
 



Re: Additional testing prerequisites

Stefan Seefeld-2
On 25.10.2016 17:44, Niklas Angare wrote:

> "Stefan Seefeld" <[hidden email]> wrote:
>> On 20.10.2016 19:27, Niklas Angare wrote:
>>> For my test runner NA-QNX650-SP1-x86 which has Python 2.5.2, at least
>>> some of the failures seem to be caused by the test code trying to use
>>> newer Python features. The documentation for Boost.Python claims to
>>> require only Python 2.2.
> ...
>> Can you point me to the specific tests / test failures; I may have a
>> look. (Ideally please follow up with Boost.Python issues in
>> https://github.com/boostorg/python/issues)
>
> I'm just running tests because I want Boost to be usable on QNX.
> Boost.Python is not a library I foresee myself using so I'm not
> inclined to spend a lot of time on it.

OK, fair enough.


> This suspected issue of Python version compatibility is probably not
> limited to QNX, but you may be able to learn about it by looking at my
> test results.
>
> Just go to the develop summary matrix for Python and click on the
> failures in the "NA-QNX650-SP1-x86" column. I'll give you a taste:
>
> test python - args:
>  File "../libs/python/test/args.py", line 4
>    from __future__ import print_function
> SyntaxError: future feature print_function is not defined
>
> test python - bienstman3:
> <doctest __main__[1]>:3: Warning: 'as' will become a reserved keyword
> in Python 2.6
> File "../libs/python/test/bienstman3.py", line 7, in __main__
>
> test python - builtin_converters:
>  File "../libs/python/test/test_builtin_converters.py", line 5, in
> <module>
>    if (sys.version_info.major >= 3):
> AttributeError: 'tuple' object has no attribute 'major'

OK, so it seems our testing logic (at least) actually requires a more
recent Python version. At this point I'm not sure how much effort is
worth spending on the testing harness to support older versions of
Python. And since you aren't interested in Boost.Python yourself,
perhaps it would be easier if you disabled it altogether in your test
runs, so as to reduce your testing workload and the noise in the
Boost.Python test matrix.

>>> My other runner NA-QNX650-SP1-ARM is cross compiling and the target
>>> environment doesn't have Python so testing Boost.Python is not
>>> desirable. Should I disable it with --bjam-options="--without-python"?
>>
>> I think so (though I'd expect boost.build to do that automatically if
>> Python can't be detected at config time.)
>
> For NA-QNX650-SP1-ARM, Python is present and working on the host
> running Boost.Build, but not on the target. I'm not aware of any way
> to let Boost.Build know this. I've added --without-python but it will
> take a day or two before the tests are re-run.

OK, thanks!

>
> Regards,
>
> Niklas Angare

        Stefan

--

      ...ich hab' noch einen Koffer in Berlin...


Re: Additional testing prerequisites

Stefan Seefeld-2
In reply to this post by Stefan Seefeld-2
Tom,

On 25.10.2016 10:52, Stefan Seefeld wrote:

> On 19.10.2016 18:11, Tom Kent wrote:
>>
>> On Wed, Oct 19, 2016 at 9:25 AM, Stefan Seefeld <[hidden email]
>> <mailto:[hidden email]>> wrote:
>>
>>     Hello,
>>
>>     I recently added support for NumPy to the Boost.Python module. To
>>     compile (and test) that, the development environment needs to include
>>     Python's NumPy package. As I don't see the Boost.Python NumPy tests on
>>     http://www.boost.org/development/tests/develop/developer/python.html
>>     <http://www.boost.org/development/tests/develop/developer/python.html>,
>>     I
>>     suspect none of the test machines have that Python NumPy package
>>     installed.
>>
>>     What is the right way to update these ?
>>
>>
>> I just ran "pip install numpy" on my teeks99-09 machine, lets see if
>> those runners start hitting it.
> Unfortunately I still don't see any NumPy tests in any of the test runs,
> and there are no test logs (that I'm aware of) to look at to see what's
> going on.
> Could you please also add numpy to the other machines (notably the ones
> running Linux) ?

I just ran the tests locally (Linux) and verified that the NumPy
extension is properly built and the tests are executed and pass.
Would it be possible for you to send me your bjam.log file (privately),
or look into it to see whether there is any mention of "numpy"?

I'm wondering whether the absence of these tests in the test matrix in
http://www.boost.org/development/tests/develop/developer/python.html
indicates that these tests weren't run in any test run yet, or whether
it's merely a failure in the post-processing of the result logs.

Many thanks!

        Stefan


--

      ...ich hab' noch einen Koffer in Berlin...


Re: Additional testing prerequisites

Steve M. Robbins-2
In reply to this post by Stefan Seefeld-2
On Tuesday, October 25, 2016 10:52:55 AM CDT Stefan Seefeld wrote:

> Unfortunately I still don't see any NumPy tests in any of the test runs,
> and there are no test logs (that I'm aware of) to look at to see what's
> going on.
> Could you please also add numpy to the other machines (notably the ones
> running Linux) ?

I have had numpy installed on my tester ("Debian-Sid") for years, but no numpy
tests have appeared in the web summary.  Is there some other action I should
take?

My local build log indicates a bunch of numpy tests being built, and running
successfully.  Within results/boost/bin.v2/libs/python/test/numpy, grep -r
EXIT . shows:

./ndarray.test/gcc-6.2.0/debug/numpy/ndarray.output:EXIT STATUS: 0
./ndarray.test/gcc-6.2.0/debug/numpy/ndarray:EXIT STATUS: 0
./ndarray.test/gcc-6.2.0/debug/numpy/test_log.xml:EXIT STATUS: 0
./dtype.test/gcc-6.2.0/debug/numpy/dtype.output:EXIT STATUS: 0
./dtype.test/gcc-6.2.0/debug/numpy/dtype:EXIT STATUS: 0
./dtype.test/gcc-6.2.0/debug/numpy/test_log.xml:EXIT STATUS: 0
./ufunc.test/gcc-6.2.0/debug/numpy/ufunc:EXIT STATUS: 0
./ufunc.test/gcc-6.2.0/debug/numpy/ufunc.output:EXIT STATUS: 0
./ufunc.test/gcc-6.2.0/debug/numpy/test_log.xml:EXIT STATUS: 0
./indexing.test/gcc-6.2.0/debug/numpy/indexing:EXIT STATUS: 0
./indexing.test/gcc-6.2.0/debug/numpy/test_log.xml:EXIT STATUS: 0
./indexing.test/gcc-6.2.0/debug/numpy/indexing.output:EXIT STATUS: 0
./shapes.test/gcc-6.2.0/debug/numpy/shapes.output:EXIT STATUS: 0
./shapes.test/gcc-6.2.0/debug/numpy/shapes:EXIT STATUS: 0
./shapes.test/gcc-6.2.0/debug/numpy/test_log.xml:EXIT STATUS: 0
./templates.test/gcc-6.2.0/debug/numpy/templates:EXIT STATUS: 0
./templates.test/gcc-6.2.0/debug/numpy/templates.output:EXIT STATUS: 0
./templates.test/gcc-6.2.0/debug/numpy/test_log.xml:EXIT STATUS: 0
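Output like the above can be summarized with a throwaway script; this sketch just parses the trailing exit codes (the sample lines stand in for the real grep output rather than being read from disk):

```python
# Summarize "grep -r EXIT ." output, where each line ends in
# "EXIT STATUS: <code>". Sample data mirrors the grep output above.
lines = [
    "./ndarray.test/gcc-6.2.0/debug/numpy/ndarray.output:EXIT STATUS: 0",
    "./dtype.test/gcc-6.2.0/debug/numpy/dtype.output:EXIT STATUS: 0",
]

statuses = [int(line.rsplit("EXIT STATUS:", 1)[1]) for line in lines]
failures = sum(1 for s in statuses if s != 0)
print("%d results, %d failures" % (len(statuses), failures))
# -> 2 results, 0 failures
```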

I've no clue why this won't show up on the web page.
-Steve


Re: Additional testing prerequisites

Rene Rivera-2
On Tue, Oct 25, 2016 at 10:33 PM, Steve M. Robbins <[hidden email]> wrote:
On Tuesday, October 25, 2016 10:52:55 AM CDT Stefan Seefeld wrote:

> Unfortunately I still don't see any NumPy tests in any of the test runs,
> and there are no test logs (that I'm aware of) to look at to see what's
> going on.
> Could you please also add numpy to the other machines (notably the ones
> running Linux) ?

I have had numpy installed on my tester ("Debian-Sid") for years, but no numpy
tests have appeared in the web summary.  Is there some other action I should
take?

My local build log indicates a bunch of numpy tests being built, and running
successfully.  Within results/boost/bin.v2/libs/python/test/numpy, grep -r
EXIT . shows:

./ndarray.test/gcc-6.2.0/debug/numpy/ndarray.output:EXIT STATUS: 0
./ndarray.test/gcc-6.2.0/debug/numpy/ndarray:EXIT STATUS: 0
./ndarray.test/gcc-6.2.0/debug/numpy/test_log.xml:EXIT STATUS: 0
[...] 
./templates.test/gcc-6.2.0/debug/numpy/test_log.xml:EXIT STATUS: 0

I've no clue why this won't show up on the web page.

There's a variety of reasons it might not... which means I'll have to look into the processing to find out where it breaks :-(



--
-- Rene Rivera
-- Grafik - Don't Assume Anything
-- Robot Dreams - http://robot-dreams.net
-- rrivera/acm.org (msn) - grafikrobot/aim,yahoo,skype,efnet,gmail


Re: Additional testing prerequisites

Stefan Seefeld-2
In reply to this post by Steve M. Robbins-2
On 25.10.2016 23:33, Steve M. Robbins wrote:

> On Tuesday, October 25, 2016 10:52:55 AM CDT Stefan Seefeld wrote:
>
>> Unfortunately I still don't see any NumPy tests in any of the test runs,
>> and there are no test logs (that I'm aware of) to look at to see what's
>> going on.
>> Could you please also add numpy to the other machines (notably the ones
>> running Linux) ?
> I have had numpy installed on my tester ("Debian-Sid") for years, but no numpy
> tests have appeared in the web summary.  Is there some other action I should
> take?
>
> My local build log indicates a bunch of numpy tests being built, and running
> successfully.  Within results/boost/bin.v2/libs/python/test/numpy, grep -r
> EXIT . shows:
>
> ./ndarray.test/gcc-6.2.0/debug/numpy/ndarray.output:EXIT STATUS: 0
> ./ndarray.test/gcc-6.2.0/debug/numpy/ndarray:EXIT STATUS: 0
> ./ndarray.test/gcc-6.2.0/debug/numpy/test_log.xml:EXIT STATUS: 0
> ./dtype.test/gcc-6.2.0/debug/numpy/dtype.output:EXIT STATUS: 0
> ./dtype.test/gcc-6.2.0/debug/numpy/dtype:EXIT STATUS: 0
> ./dtype.test/gcc-6.2.0/debug/numpy/test_log.xml:EXIT STATUS: 0
> ./ufunc.test/gcc-6.2.0/debug/numpy/ufunc:EXIT STATUS: 0
> ./ufunc.test/gcc-6.2.0/debug/numpy/ufunc.output:EXIT STATUS: 0
> ./ufunc.test/gcc-6.2.0/debug/numpy/test_log.xml:EXIT STATUS: 0
> ./indexing.test/gcc-6.2.0/debug/numpy/indexing:EXIT STATUS: 0
> ./indexing.test/gcc-6.2.0/debug/numpy/test_log.xml:EXIT STATUS: 0
> ./indexing.test/gcc-6.2.0/debug/numpy/indexing.output:EXIT STATUS: 0
> ./shapes.test/gcc-6.2.0/debug/numpy/shapes.output:EXIT STATUS: 0
> ./shapes.test/gcc-6.2.0/debug/numpy/shapes:EXIT STATUS: 0
> ./shapes.test/gcc-6.2.0/debug/numpy/test_log.xml:EXIT STATUS: 0
> ./templates.test/gcc-6.2.0/debug/numpy/templates:EXIT STATUS: 0
> ./templates.test/gcc-6.2.0/debug/numpy/templates.output:EXIT STATUS: 0
> ./templates.test/gcc-6.2.0/debug/numpy/test_log.xml:EXIT STATUS: 0

Great, thanks for confirming that!

> I've no clue why this won't show up on the web page.

It seems there is some bug in the results (post-)processing preventing
these tests from showing up.

    Stefan


--

      ...ich hab' noch einen Koffer in Berlin...


Re: Additional testing prerequisites

Steve M. Robbins-2
On Wed, Oct 26, 2016 at 07:47:25AM -0400, Stefan Seefeld wrote:
> On 25.10.2016 23:33, Steve M. Robbins wrote:

> > I've no clue why this won't show up on the web page.
>
> It seems there is some bug in the results (post-)processing preventing
> these tests from showing up.

Right.  The odd thing is, though: the results do show up for certain
testers.  Just not mine :-(

-S


Re: Additional testing prerequisites

Rene Rivera-2
On Wed, Oct 26, 2016 at 9:43 AM, Steve M. Robbins <[hidden email]> wrote:
On Wed, Oct 26, 2016 at 07:47:25AM -0400, Stefan Seefeld wrote:
> On 25.10.2016 23:33, Steve M. Robbins wrote:

> > I've no clue why this won't show up on the web page.
>
> It seems there is some bug in the results (post-)processing preventing
> these tests from showing up.

Right.  The odd thing is, though: the results do show up for certain
testers.  Just not mine :-(

Now I'm confused... Which tests aren't showing up that should? Which are the numpy tests? What should they look like when working correctly?


--
-- Rene Rivera
-- Grafik - Don't Assume Anything
-- Robot Dreams - http://robot-dreams.net
-- rrivera/acm.org (msn) - grafikrobot/aim,yahoo,skype,efnet,gmail


Re: Additional testing prerequisites

Stefan Seefeld-2
On 26.10.2016 11:09, Rene Rivera wrote:

> On Wed, Oct 26, 2016 at 9:43 AM, Steve M. Robbins <[hidden email]
> <mailto:[hidden email]>> wrote:
>
>     On Wed, Oct 26, 2016 at 07:47:25AM -0400, Stefan Seefeld wrote:
>     > On 25.10.2016 23:33, Steve M. Robbins wrote:
>
>     > > I've no clue why this won't show up on the web page.
>     >
>     > It seems there is some bug in the results (post-)processing
>     preventing
>     > these tests from showing up.
>
>     Right.  The odd thing is, though: the results do show up for certain
>     testers.  Just not mine :-(
>
>
> Now I'm confused.. Which tests aren't showing up that should? Which
> are the numpy tests? What should they look like when working correctly?

I have no idea what they should look like (as they have never been
displayed, in fact they don't even show up as row labels).
Hmm, I'm looking at the generated .xml file. The 'test-log' entries for
the numpy tests have empty 'test-name' attributes. That's suspicious...
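(For illustration, such entries are easy to flag mechanically. The element and attribute names below follow the 'test-log'/'test-name' wording above, and the sample XML is made up, since I don't have the actual report schema at hand:)

```python
import xml.etree.ElementTree as ET

# Made-up fragment mimicking a results file in which one entry has an
# empty test-name attribute.
report = """<test-run>
  <test-log test-name="ndarray" result="success"/>
  <test-log test-name="" result="success"/>
</test-run>"""

root = ET.fromstring(report)
broken = [e for e in root.iter("test-log") if not e.get("test-name")]
print("entries with empty test-name:", len(broken))
# -> entries with empty test-name: 1
```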

Can you hop on IRC? That would make follow-up simpler.

(It would be good to have a formal definition of the XML schema used in
these test reports, so I can validate the actual output. But I suspect
no such thing exists, right?)

Thanks,
        Stefan

--

      ...ich hab' noch einen Koffer in Berlin...


Re: Additional testing prerequisites

Steve M. Robbins-2
In reply to this post by Rene Rivera-2
On Wednesday, October 26, 2016 10:09:37 AM CDT Rene Rivera wrote:

> On Wed, Oct 26, 2016 at 9:43 AM, Steve M. Robbins <[hidden email]> wrote:
> > On Wed, Oct 26, 2016 at 07:47:25AM -0400, Stefan Seefeld wrote:
> > > On 25.10.2016 23:33, Steve M. Robbins wrote:
> > > > I've no clue why this won't show up on the web page.
> > >
> > > It seems there is some bug in the results (post-)processing preventing
> > > these tests from showing up.
> >
> > Right.  The odd thing is, though: the results do show up for certain
> > testers.  Just not mine :-(
>
> Now I'm confused.. Which tests aren't showing up that should? Which are the
> numpy tests? What should they look like when working correctly?
I'm looking at the develop summary for python [1], and if you look down the
left side, there is a row labelled "numpy". If you look across this row,
you will see most of it is blank -- including my tester "Debian Sid", despite
it reporting that the numpy tests run -- but a few of them are labelled "pass".
The passes seem to correlate with non-gcc builds (clang or intel compiler).

Yesterday there was just one "numpy" row; now there are "numpy" and
"numpy~dtype", "numpy~indexing", ... The latter are also mostly blank.

Finally: I should confess that I haven't touched my configuration in months,
possibly a year or more.  I use wget to obtain run.py from github [2].  Maybe
that is out of date now?

-Steve


[1] http://www.boost.org/development/tests/develop/developer/python.html
[2] https://raw.githubusercontent.com/boostorg/regression/develop/testing/src/run.py


Re: Additional testing prerequisites

Rene Rivera-2
On Wed, Oct 26, 2016 at 9:18 PM, Steve M. Robbins <[hidden email]> wrote:
On Wednesday, October 26, 2016 10:09:37 AM CDT Rene Rivera wrote:
> On Wed, Oct 26, 2016 at 9:43 AM, Steve M. Robbins <[hidden email]> wrote:
> > On Wed, Oct 26, 2016 at 07:47:25AM -0400, Stefan Seefeld wrote:
> > > On 25.10.2016 23:33, Steve M. Robbins wrote:
> > > > I've no clue why this won't show up on the web page.
> > >
> > > It seems there is some bug in the results (post-)processing preventing
> > > these tests from showing up.
> >
> > Right.  The odd thing is, though: the results do show up for certain
> > testers.  Just not mine :-(
>
> Now I'm confused.. Which tests aren't showing up that should? Which are the
> numpy tests? What should they look like when working correctly?

I fixed the problem a few hours ago. So now we just need to wait for reports to stabilize.
 
I'm looking at the develop summary for python [1] and if you look down the
left side, there is a row labelled "numpy".    If you look across this row,
you will see most of it is blank -- including my tester "Debian Sid", despite
reporting that the numpy tests run -- but a few of them are labelled "pass".
The passes seem to correlate with non-gcc builds (clang or intel compiler).

Yesterday there was just one "numpy" row, now there is "numpy" and 

That numpy row is the problem. It shouldn't be there, afaict.
 
"numpy~dtype", "numpy~indexing" ... the latter are also mostly blank.

Those are from the fix. And are the new tests.
 
Finally: I should confess that I haven't touched my configuration in months,
possibly a year or more.  I use wget to obtain run.py from github [2].  Maybe
that is out of date now?

Should be OK. But in the past I've set up testing to download a fresh run.py on each run to ensure I always have all the latest code.

--
-- Rene Rivera
-- Grafik - Don't Assume Anything
-- Robot Dreams - http://robot-dreams.net
-- rrivera/acm.org (msn) - grafikrobot/aim,yahoo,skype,efnet,gmail


Re: Additional testing prerequisites

Steven Watanabe-4
AMDG

On 10/26/2016 08:31 PM, Rene Rivera wrote:

> On Wed, Oct 26, 2016 at 9:18 PM, Steve M. Robbins <[hidden email]> wrote:
>
>>
>> Finally: I should confess that I haven't touched my configuration in
>> months,
>> possibly a year or more.  I use wget to obtain run.py from github [2].
>> Maybe
>> that is out of date now?
>>
>
> Should be OK. But in the past I've set up testing to download a fresh
> run.py on each run to ensure I always have all the latest code.
>

  That's basically pointless, since run.py is
just a thin wrapper that downloads the real
script.

In Christ,
Steven Watanabe


Re: Additional testing prerequisites

Rene Rivera-2


On Wed, Oct 26, 2016 at 9:42 PM, Steven Watanabe <[hidden email]> wrote:
AMDG

On 10/26/2016 08:31 PM, Rene Rivera wrote:
> On Wed, Oct 26, 2016 at 9:18 PM, Steve M. Robbins <[hidden email]> wrote:
>
>>
>> Finally: I should confess that I haven't touched my configuration in
>> months,
>> possibly a year or more.  I use wget to obtain run.py from github [2].
>> Maybe
>> that is out of date now?
>>
>
> Should be OK. But in the past I've set up testing to download a fresh
> run.py on each run to ensure I always have all the latest code.
>

  That's basically pointless, since run.py is
just a thin wrapper that downloads the real
script.

Yes, it's a wrapper.. But I have changed it in the past :-)
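
[Editor's note] For anyone setting up a runner, the "thin wrapper" pattern being discussed can be sketched roughly as follows. This is an illustration only: the URL, the `bootstrap` function, and the injectable `fetch` parameter are assumptions made for the example, not Boost's actual run.py code.

```python
# Illustrative sketch of a "thin bootstrap" script: its only job is to
# download the real driver script and hand off to it, so the driver can
# be updated server-side without touching the runner's local copy.
# The URL and all names here are hypothetical.
import urllib.request

DRIVER_URL = "https://example.org/regression-driver.py"  # hypothetical location

def bootstrap(fetch=urllib.request.urlopen, dest="driver.py"):
    """Fetch the real driver, write it locally, and return its path.

    `fetch` is injectable so the download step can be exercised offline.
    """
    with fetch(DRIVER_URL) as resp:
        code = resp.read()
    with open(dest, "wb") as f:
        f.write(code)
    return dest
```

Because the wrapper itself rarely changes, re-downloading the wrapper on every run (as Rene describes) only guards against the rare occasions when it does change; the driver it fetches is current either way.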

--
-- Rene Rivera
-- Grafik - Don't Assume Anything
-- Robot Dreams - http://robot-dreams.net
-- rrivera/acm.org (msn) - grafikrobot/aim,yahoo,skype,efnet,gmail


Re: Additional testing prerequisites

Stefan Seefeld-2
In reply to this post by Rene Rivera-2
On 26.10.2016 22:31, Rene Rivera wrote:

> On Wed, Oct 26, 2016 at 9:18 PM, Steve M. Robbins <[hidden email]
> <mailto:[hidden email]>> wrote:
>
>     On Wednesday, October 26, 2016 10:09:37 AM CDT Rene Rivera wrote:
>     > On Wed, Oct 26, 2016 at 9:43 AM, Steve M. Robbins
>     <[hidden email] <mailto:[hidden email]>> wrote:
>     > > On Wed, Oct 26, 2016 at 07:47:25AM -0400, Stefan Seefeld wrote:
>     > > > On 25.10.2016 23:33, Steve M. Robbins wrote:
>     > > > > I've no clue why this won't show up on the web page.
>     > > >
>     > > > It seems there is some bug in the results (post-)processing
>     preventing
>     > > > these tests from showing up.
>     > >
>     > > Right.  The odd thing is, though: the results do show up for
>     certain
>     > > testers.  Just not mine :-(
>     >
>     > Now I'm confused.. Which tests aren't showing up that should?
>     Which are the
>     > numpy tests? What should they look like when working correctly?
>
>
> I fixed the problem a few hours ago. So now we just need to wait for
> reports to stabilize.

yup, things look good now; thanks for the fix, Rene!

>  
>
>     I'm looking at the develop summary for python [1] and if you look
>     down the
>     left side, there is a row labelled "numpy".    If you look across
>     this row,
>     you will see most of it is blank -- including my tester "Debian
>     Sid", despite
>     reporting that the numpy tests run -- but a few of them are
>     labelled "pass".
>     The passes seem to correlate with non-gcc builds (clang or intel
>     compiler).
>
>     Yesterday there was just one "numpy" row, now there is "numpy" and
>
>
> That numpy row is the problem. It shouldn't be there, afaict.

No, that's good: there used to be a "numpy" test (which is obsoleted by
the new "numpy/*" tests), and I suppose that line will show up in the
matrix as long as at least one test run still references the old test,
i.e. isn't superseded by a new run.
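
[Editor's note] Stefan's point — a test-name row stays in the matrix as long as at least one runner's latest report still references it — can be sketched like this. The function and data names are hypothetical, not the actual report-generation code.

```python
# Hypothetical sketch: a summary matrix shows a row for every test name
# referenced by ANY runner's most recent report, so an obsolete name
# lingers until every runner has re-reported with the new names.
def matrix_rows(latest_reports):
    """latest_reports maps runner name -> set of test names in its last run."""
    rows = set()
    for tests in latest_reports.values():
        rows |= tests
    return sorted(rows)

reports = {
    "Debian Sid": {"numpy"},                            # run from before the split
    "other-runner": {"numpy~dtype", "numpy~indexing"},  # run after the fix
}
# Both the old "numpy" row and the new "numpy~*" rows appear side by side
# until every runner has submitted a fresh report.
```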


>  
>
>     "numpy~dtype", "numpy~indexing" ... the latter are also mostly blank.
>
>
> Those are from the fix. And are the new tests.

Right.

>  
>
>     Finally: I should confess that I haven't touched my configuration
>     in months,
>     possibly a year or more.  I use wget to obtain run.py from github
>     [2].  Maybe
>     that is out of date now?
>
>
> Should be OK. But in the past I've set up testing to download a fresh
> run.py on each run to ensure I always have all the latest code.

I think things are good now, and the white spots only reflect the fact
that there are test runs from before and from after the switch from the
single "numpy" test to the multiple "numpy~*" tests.

The only remaining problem is running the `python run.py` command with newer
git versions (2.7.4 in my case), which causes the error I reported earlier.

Thanks again for fixing the above,

        Stefan

--

      ...ich hab' noch einen Koffer in Berlin...
