====== DPJF: DPD for Researchers ======

If you are interested in reports on the speed of a DPD run
(and on the precision and recall of runs on projects for which
previously validated results are available), you can:
  - Set the directory in which the pattern detection results should be stored by running the query //setOutputFolder(**Path**)// in the Prolog console.
    * ++Details|: **Path** is the full path of an existing local directory. ++
    * ++Example|: //setOutputFolder('C:/dpdres')// stores the results of subsequent detection runs in the folder **C:/dpdres**.++
    * **Attention!** If you do not set the output folder explicitly, the results are stored in the folder **resultFolder** within the directory of your project.
  - Run the desired detectors, as explained in the [[engineers|DPD for Software Engineers]] section.
  - Go to the results folder set in step 1. It contains subfolders named **//projectName//-results-//date//-//time//**.
    * ++Example|: The results for the "Java IO" project generated on Nov 18, 2011 at 12:35:22 are stored in the subfolder **javaio-results-18.11.2011-12.35.22**.++
  - Each subfolder contains the following files:
    * **The "dpjf-candidates.pl" file** contains the DP candidates generated by DPJF for the given repository.
      * ++Details|: Candidates are represented by Prolog facts of the form **candidate(//DPName//, //Score//, //RoleAssignments//)**. DPJF output can therefore be analyzed using simple Prolog queries (see the sketch after this list). ++
      * ++Example|: [[:research:dpd:dpjf:candidates|A sample Observer candidate found by DPJF in JHotDraw 5.1]]++
    * **Each "statistics-//patterngroup//.csv" file** (++Example|: "statistics-decorators_cors_proxies.csv"++) contains the response time needed to detect patterns within the respective similarity group.
    * **The "dpjf-evaluation-accuracies.txt" file** contains the accuracies for each individual pattern.
      * ++Details|: Accuracy is expressed by Prolog facts of the form **accuracy(//DPName//, //Precision//, //Recall//)**. If the number of reported candidates of a given pattern is 0, **Precision=1000**. Similarly, if no instances of a given pattern were found manually, **Recall=1000**.++
      * ++Example|: The fact **accuracy(decorator, 1, 0.9)** in the folder **jhd60-results-//date//-//time//** means that Decorators are detected in JHotDraw 6.0 with precision 100% and recall 90%.++
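
The sketch below illustrates how such an analysis could look in the Prolog console. It relies only on the fact formats **candidate(//DPName//, //Score//, //RoleAssignments//)** and **accuracy(//DPName//, //Precision//, //Recall//)** described above; the folder path and the use of //consult// to load the generated files are illustrative assumptions about a local setup, not DPJF commands.

<code prolog>
% Minimal sketch (not part of DPJF): only the fact formats candidate/3 and
% accuracy/3 are taken from the file descriptions above; the folder name is
% the one from the "Java IO" example and purely illustrative.

% Load the candidates and accuracies of one run into the Prolog console:
?- consult('C:/dpdres/javaio-results-18.11.2011-12.35.22/dpjf-candidates.pl'),
   consult('C:/dpdres/javaio-results-18.11.2011-12.35.22/dpjf-evaluation-accuracies.txt').

% Enumerate all Decorator candidates with their scores and role assignments:
?- candidate(decorator, Score, RoleAssignments).

% Count how many Decorator candidates were reported:
?- findall(S, candidate(decorator, S, _), Scores), length(Scores, N).

% Look up the precision and recall computed for the Decorator pattern:
?- accuracy(decorator, Precision, Recall).
</code>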

DPJF computes accuracies only for projects for which we have control sets.
Currently, these are our [[#benchmark projects]]((We are working on interfacing DPJF to DBP, to take advantage of the control sets stored there.)).

Note that DPJF can nevertheless [[:research:dpd:dpjf:addproject|detect patterns in arbitrary Java projects]] (without computing accuracies).
  