Details

    • Type: New Feature
    • Status: Closed
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: None
    • Fix Version/s: PYTHON-1.1
    • Component/s: Python
    • Labels: None
    • Number of attachments: 0

      Description

      Provide metrics on the execution of unit tests, such as the number of errors/failures, success rate, etc.

      Design
      ------
      There is a lot of variety when it comes to running automated tests for Python. That variety is caused by differences in source layout, environment setup, the test framework used, etc.
      Long story short: we are outside of the Java/Maven universe, we cannot make strong assumptions about project layout etc., and thus cannot provide a fully automated solution in the plugin. Instead, we leave the aspect that varies – calling the tests and capturing the output – on the project's side, where it belongs. The plugin is only responsible for parsing the report and feeding the data into Sonar. We chose the JUnitReport XML format because of its popularity.

      The format has some drawbacks though, most notably:
      1. it assumes that each function (= method) belongs to a class (it originates from a "J" tool, after all)
      2. it doesn't explicitly provide the path to the source file (that hurts in every environment where you cannot reliably map the class name to a source file)

      Despite these drawbacks, the goal should be not to move away from this (quasi-)standard format. The plan for dealing with the second drawback is:
      a) use the knowledge of the lexer/parser to map the class names to the corresponding source files. That should work in most cases.
      b) as a fallback, provide the possibility to inject the path to the source file via an OPTIONAL 'source' attribute of the 'testcase' tag.
      c) if the source file cannot be found either way, just create a 'virtual' one with the content "sources cannot be found" or similar.
      d) to be precise about our format expectations, reference a grammar in the docs which can be used to verify the validity of a report.
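      The plan above can be sketched in a few lines of Python. This is a minimal illustration, not the plugin's actual parser: it reads a JUnitReport-style document, derives the metrics mentioned in the description (errors, failures, success rate), and uses the OPTIONAL 'source' attribute from point b) with the 'virtual' fallback from point c). The sample report and all names in it are made up for the example.

```python
# Minimal sketch of parsing a JUnitReport-style XML document
# (illustration only; not the plugin's real implementation).
import xml.etree.ElementTree as ET

REPORT = """<testsuite name="example" tests="3" errors="1" failures="1">
  <testcase classname="tests.test_math" name="test_add" time="0.001"/>
  <testcase classname="tests.test_math" name="test_div" time="0.002"
            source="tests/test_math.py">
    <failure message="assert 1 == 2"/>
  </testcase>
  <testcase classname="tests.test_io" name="test_read" time="0.003">
    <error message="IOError"/>
  </testcase>
</testsuite>"""

def parse_report(xml_text):
    suite = ET.fromstring(xml_text)
    tests = int(suite.get("tests", "0"))
    errors = int(suite.get("errors", "0"))
    failures = int(suite.get("failures", "0"))
    # Success rate as described in the issue: passed tests over all tests.
    rate = 100.0 * (tests - errors - failures) / tests if tests else 0.0
    # Map each testcase to a source file: prefer the optional 'source'
    # attribute (point b); otherwise fall back to a 'virtual' marker (point c).
    sources = {}
    for case in suite.iter("testcase"):
        key = "%s.%s" % (case.get("classname"), case.get("name"))
        sources[key] = case.get("source", "sources cannot be found")
    return {"tests": tests, "errors": errors, "failures": failures,
            "success_rate": rate, "sources": sources}
```

      With the sample report above, `parse_report(REPORT)` yields 3 tests, 1 error, 1 failure and a success rate of one third; only `test_div` resolves to a real source path.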

        Activity

        Evgeny Mandrikov added a comment -

        For the record: nose is one of the tools in use.

        Evgeny Mandrikov added a comment -

        Also see https://github.com/cmheisel/nose-xcover, an extension to nose that allows producing coverage reports in Cobertura style.
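        For context, a typical invocation producing both an xunit report and a Cobertura-style coverage report might look like this. The flag names are an assumption based on the nose and nose-xcover plugins and depend on the installed versions:

```shell
# Hypothetical invocation; flag names depend on installed versions
# of nose and nose-xcover.
nosetests --with-xunit --xunit-file=nosetests.xml \
          --with-xcoverage --xcoverage-file=coverage.xml
```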

        Evgeny Mandrikov added a comment -

        Hi Waleri, why is this issue still marked as "in progress" even though code was committed?

        Waleri Enns added a comment -

        Because it's not finished yet. The part about locating the resources is still in progress. I will finish it this or next week, I guess.
        FYI: this is the last piece that was on my sheet for the next version of the plugin.

        Waleri Enns added a comment -

        Resolved with commit 774afc68c4c796ef390937740c59b770d3eba06f.
        The extension of the report format by the optional 'path' field has been left out: let's see if the need arises in practice.

        Freddy Mallet added a comment -

        @Waleri, just to be sure of my understanding, am I right in saying:

        • It's up to the Sonar user to execute the Python unit tests
        • It's up to the Sonar user to generate a unit test execution report complying with the JUnit report XML format (whatever tool is used to do that)
        • The property "sonar.python.xunit.reportPath" must be defined to provide the location of this unit test execution report
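        In concrete terms, the setup described above would amount to a single line in the project's analysis configuration. The file name and report path below are illustrative; only the property name "sonar.python.xunit.reportPath" comes from this issue:

```
# sonar-project.properties (sketch; path is an example)
sonar.python.xunit.reportPath=nosetests.xml
```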
        Waleri Enns added a comment -

        Yes, that's correct.


          People

          • Assignee: Waleri Enns
          • Reporter: Waleri Enns
          • Votes: 2
          • Watchers: 4

            Dates

            • Created:
              Updated:
              Resolved: