===== Exam Reports - Results Export =====
The Results Export enables the user to view various combinations of binomial results, points awarded per Item, Candidate responses and scores at Exam, Section or Competency level.
  
Right-click a row in the Exam grid and choose **Reports → Results Export**.
  
{{:examentrycontextreports.png?600|}}
  
This will navigate to the Results Export page.

{{:exam_reports_resultsexport.png?600|}}
  
==== Report Types ====
=== Binomial ===
^Column^Description^
|ID|The Registrant's Display ID.|
|Name|The Registrant's last name.|
|Gender|Registrant's [[gender_entry|Gender]] code.|
|Ethnicity|Registrant's [[ethnicity_entry|Ethnicity]] code.|
|Education|Registrant's education code (Custom 04).|
|B1..Bx|Binomial value for each Item on the Exam.  Binomial is 0 if the Registrant answered incorrectly, 1 if the Registrant answered correctly.|
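
For readers who post-process the export programmatically, the sketch below illustrates how 0/1 binomial values of this kind are derived from responses. It is a minimal sketch only; the answer key and responses are hypothetical placeholders, and the actual values are produced by the report itself.

<code python>
# Minimal sketch: deriving binomial (0/1) values per Item from responses.
# The answer key and response data are hypothetical sample values; the real
# figures come from the Exam and the Results Export report.
answer_key = ["A", "C", "B", "D"]                 # correct option per Item (I1..I4)
responses = {
    "REG001": ["A", "C", "D", "D"],
    "REG002": ["B", "C", "B", "A"],
}

for registrant_id, answers in responses.items():
    # 1 if the response matches the key, 0 otherwise (the B1..Bx columns)
    binomials = [1 if given == correct else 0
                 for given, correct in zip(answers, answer_key)]
    print(registrant_id, binomials)
</code>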
  
=== Response Binomial ===
^Column^Description^
|ID|The Registrant's Display ID.|
|Name|The Registrant's last name.|
|Gender|Registrant's [[gender_entry|Gender]] code.|
|Ethnicity|Registrant's [[ethnicity_entry|Ethnicity]] code.|
|Education|Registrant's education code (Custom 04).|
|R1..Rx|Registrant response for each Item on the Exam.|
|B1..Bx|Binomial value for each Item on the Exam.  Binomial is 0 if the Registrant answered incorrectly, 1 if the Registrant answered correctly.|
=== Response Competency ===
^Column^Description^
|Candidate ID|The Registrant's Display ID.|
|Last Name|The Registrant's last name.|
|Form Written|Registrant's Form ID (Custom 01).|
|Responses|Registrant response for each Item on the Exam, combined in one column.|
|Raw Score|Registrant's count of all correct responses on the Exam.|
|Competency Scores|Each Registrant competency score, which is the count of correct responses across all Items on the Exam that are linked to that Competency.|
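
As a rough illustration of how the Raw Score and Competency Scores columns relate to the per-Item results, a minimal sketch follows. The Item-to-Competency links and the 0/1 values are hypothetical placeholders, not data from any real Exam.

<code python>
# Minimal sketch: Raw Score counts all correct responses on the Exam; each
# Competency Score counts correct responses on Items linked to that Competency.
# The mapping and binomial values below are hypothetical sample data.
from collections import defaultdict

item_competency = {"I1": "Competency A", "I2": "Competency A",
                   "I3": "Competency B", "I4": "Competency B"}
binomials = {"I1": 1, "I2": 0, "I3": 1, "I4": 1}   # 1 = correct, 0 = incorrect

raw_score = sum(binomials.values())                # 3

competency_scores = defaultdict(int)
for item_id, correct in binomials.items():
    competency_scores[item_competency[item_id]] += correct

print("Raw Score:", raw_score)
print("Competency Scores:", dict(competency_scores))   # {'Competency A': 1, 'Competency B': 2}
</code>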
=== Response Result ===
^Column^Description^
|ID|The Registrant's Display ID.|
|Name|The Registrant's last name.|
|Gender|Registrant's [[gender_entry|Gender]] code.|
|Ethnicity|Registrant's [[ethnicity_entry|Ethnicity]] code.|
|Education|Registrant's education code (Custom 04).|
|TScore|Registrant's [[stats_concepts#Alternate Scores|TScore]] on the Exam.|
|Converted Score|Registrant's [[stats_candidate|STScore]] on the Exam.|
|Subscale1..Subscalex|Registrant's [[stats_section|TScore]] by Section.|
|R1..Rx|Registrant response for each Item on the Exam.|
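
The TScore, Converted Score and Subscale columns are standardized scores; the exact calculations are described on the linked stats pages. As a hedged illustration only, a T-score is conventionally a z-score rescaled to mean 50 and standard deviation 10, which the sketch below applies to made-up raw scores. The application's own method may differ.

<code python>
# Conventional T-score: T = 50 + 10 * (x - mean) / stddev.
# Raw scores are made up; the choice of population SD is an assumption and the
# application's exact calculation is documented on the linked stats pages.
import statistics

raw_scores = [32, 41, 45, 38, 50]
mean = statistics.mean(raw_scores)
sd = statistics.pstdev(raw_scores)

t_scores = [round(50 + 10 * (x - mean) / sd, 1) for x in raw_scores]
print(t_scores)
</code>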
=== TR Score ===
^Column^Description^
|Display ID|The Registrant's Display ID.|
|Last Name|The Registrant's last name.|
|First Name|The Registrant's first name.|
|Middle Name|The Registrant's middle name.|
|Gender|Registrant's [[gender_entry|Gender]] code.|
|Ethnicity|Registrant's [[ethnicity_entry|Ethnicity]] code.|
|%|Registrant's raw percentage of Items answered correctly on the Exam.  For example, if they answered 35 of 50 Items the value would be 70.|
|Section1..Sectionx|Registrant's count of correct responses for all Items in each Section of the Exam.|
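
The % column is straightforward arithmetic: the count of correct responses divided by the number of Items, multiplied by 100, so 35 correct out of 50 gives 70. A minimal sketch with hypothetical Section assignments:

<code python>
# Minimal sketch of the % and Section1..Sectionx columns.
# Binomial values and Section assignments are hypothetical sample data.
binomials = [1, 0, 1, 1, 0, 1]                      # 1 = correct per Item
item_sections = ["S1", "S1", "S1", "S2", "S2", "S2"]

percent = 100 * sum(binomials) / len(binomials)     # e.g. 35 of 50 -> 70
section_counts = {}
for section, correct in zip(item_sections, binomials):
    section_counts[section] = section_counts.get(section, 0) + correct

print(round(percent, 1), section_counts)            # 66.7 {'S1': 2, 'S2': 2}
</code>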
  
  
==== Running the Report ====
To run the report:
  - Select a Report type using the Report drop down selector (in green).
  - Select a [[stats_concepts#stats result set|Stats Result]] set from the selection grid (in yellow).
  - Click the **Run Report** button.  This will navigate to processing pages with progress bars.

The export can be cancelled by clicking the **Request Cancel** button.  Otherwise, when the report completes a spreadsheet file will be created.  The export file can be downloaded using the **Download Result** button.
  
{{:exam_reports_resultsexportcomplete.png?600|}}
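
Once downloaded, the spreadsheet can be opened in any analysis tool. As a hedged example, assuming the Binomial report was exported and saved as "results_export.xlsx" (the file name and exact format here are placeholders), it could be loaded like this:

<code python>
# Illustrative only: load a downloaded Results Export file for further analysis.
# The file name and sheet layout are assumptions; check the actual download.
import pandas as pd

df = pd.read_excel("results_export.xlsx")           # or pd.read_csv(...) for CSV

# The B1..Bx columns hold the 0/1 binomial values described above.
binomial_cols = [c for c in df.columns if c.startswith("B")]
df["RawScore"] = df[binomial_cols].sum(axis=1)

print(df[["ID", "Name", "RawScore"]].head())
</code>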
  
==== Return to Exam Entry ====