Nantes Université

Commit 2b62a5fb authored by rlaz

Small updates

parent 49ae8f21
No related branches found
No related tags found
No related merge requests found
@@ -165,7 +165,6 @@ echo "Evaluating..."
 INDEX=0
 for file in $TARGETS
 do
 	FNAME=`basename $file .lg`
 	nextFile="_ERROR_"
 	if [ $MODE == "Dir" ]
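The loop above derives each output file name with `basename $file .lg`, which strips both the directory prefix and the `.lg` extension. A minimal sketch (the path is made up for illustration):

```shell
# Hypothetical .lg target path, as the loop would receive from $TARGETS.
file=results/expr_001.lg

# basename drops the directory and the trailing .lg suffix.
FNAME=`basename $file .lg`
echo "$FNAME"   # -> expr_001
```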
@@ -256,9 +255,12 @@ else
 fi
 # Compute summaries
-python3 $LgEvalDir/src/sumMetric.py "$LABEL_STRING" $ResultsDir/$BNAME.csv > $ResultsDir/Summary.txt
-python3 $LgEvalDir/src/sumDiff.py $ResultsDir/$BNAME.diff $ResultsDir/labelsGT.txt html > $ResultsDir/ConfusionMatrices.html
-python3 $LgEvalDir/src/sumDiff.py $ResultsDir/$BNAME.diff $ResultsDir/labelsGT.txt > $ResultsDir/ConfusionMatrices.csv
+python3 $LgEvalDir/src/sumMetric.py "$LABEL_STRING" $ResultsDir/$BNAME.csv > \
+	$ResultsDir/Summary.txt
+python3 $LgEvalDir/src/sumDiff.py $ResultsDir/$BNAME.diff $ResultsDir/labelsGT.txt html > \
+	$ResultsDir/ConfusionMatrices.html
+python3 $LgEvalDir/src/sumDiff.py $ResultsDir/$BNAME.diff $ResultsDir/labelsGT.txt > \
+	$ResultsDir/ConfusionMatrices.csv
 ################################################################
@@ -269,12 +271,12 @@ python3 $LgEvalDir/src/sumDiff.py $ResultsDir/$BNAME.diff $ResultsDir/labelsGT.t
 # Use awk and head to select every odd (headers) and even (data) columns,
 # Concatenate one header row with data contents.
 awk -F',' '{ for (i=1;i<=NF;i+=2) printf ("%s%c", $i, i + 2 <= NF ? "," : "\n")}' $ResultsDir/$BNAME.csv > $ResultsDir/Headers.csv
-awk -F',' '{ for (i=2;i<=NF;i+=2) printf ("%s%c", $i, i + 2 <= NF ? "," : "\n")}' $ResultsDir/$BNAME.csv > $ResultsDir/Data.csv
 # Obtain first row for data labels; insert a "File" label in the first column.
 head -n 1 $ResultsDir/Headers.csv > $ResultsDir/HeaderRow.csv
 HEAD=`cat $ResultsDir/HeaderRow.csv`
 echo "File,Result,$HEAD" > $ResultsDir/HeaderRow.csv
+awk -F',' '{ for (i=2;i<=NF;i+=2) printf ("%s%c", $i, i + 2 <= NF ? "," : "\n")}' $ResultsDir/$BNAME.csv > $ResultsDir/Data.csv
 # Combine file names with raw data metrics, then add header labels.
 paste -d , $ResultsDir/FileResults.csv $ResultsDir/Data.csv > $ResultsDir/DataNew.csv
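The column-splitting and pasting steps above can be sketched on a toy input. This assumes `$BNAME.csv` interleaves metric names and values as `name,value,name,value` per row, and that `FileResults.csv` holds one `file,result` pair per row; the metric names, file names, and values below are all made up:

```shell
# Toy stand-in for $BNAME.csv: interleaved name,value pairs per row.
printf 'Rate,0.95,Recall,0.90\nRate,0.85,Recall,0.80\n' > metrics.csv

# Odd columns -> metric names, even columns -> metric values.
awk -F',' '{ for (i=1;i<=NF;i+=2) printf ("%s%c", $i, i + 2 <= NF ? "," : "\n")}' metrics.csv > headers.csv
awk -F',' '{ for (i=2;i<=NF;i+=2) printf ("%s%c", $i, i + 2 <= NF ? "," : "\n")}' metrics.csv > data.csv

# One header row, with File and Result labels prepended.
HEAD=`head -n 1 headers.csv`
echo "File,Result,$HEAD" > headerRow.csv

# Toy stand-in for FileResults.csv: one file name and outcome per row.
printf 'doc1.lg,Correct\ndoc2.lg,Error\n' > fileResults.csv

# Join file names with their metric values, then prepend the header row.
paste -d , fileResults.csv data.csv > dataNew.csv
cat headerRow.csv dataNew.csv
# -> File,Result,Rate,Recall
#    doc1.lg,Correct,0.95,0.90
#    doc2.lg,Error,0.85,0.80
```

The `i + 2 <= NF` test is what emits a comma between fields but a newline after the last selected field, so the output stays valid single-line CSV.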
@@ -285,7 +287,9 @@ cat $ResultsDir/HeaderRow.csv $ResultsDir/DataNew.csv > $ResultsDir/FileMetrics.
 # Clean up
 ##################################
 rm -f $ResultsDir/Headers.csv $ResultsDir/HeaderRow.csv $ResultsDir/Data.csv
-rm -f $ResultsDir/DataNew.csv $ResultsDir/FileResults.csv
+rm -f $ResultsDir/DataNew.csv
+# RZ: not deleting FileResults, to ensure that all files are present.
+#rm -f $ResultsDir/FileResults.csv
 rm -f $ResultsDir/$BNAME.csv $ResultsDir/$BNAME.diff