OMP: pid 25649 tid 25649 thread 0 bound to OS proc set {0}
OMP: pid 25649 tid 26117 thread 64 bound to OS proc set {64}
OMP: pid 25649 tid 26120 thread 67 bound to OS proc set {67}
OMP: pid 25649 tid 26119 thread 66 bound to OS proc set {66}
OMP: pid 25649 tid 26118 thread 65 bound to OS proc set {65}
OMP: pid 25649 tid 26054 thread 1 bound to OS proc set {1}
OMP: pid 25649 tid 26065 thread 12 bound to OS proc set {12}
OMP: pid 25649 tid 26061 thread 8 bound to OS proc set {8}
OMP: pid 25649 tid 26059 thread 6 bound to OS proc set {6}
OMP: pid 25649 tid 26129 thread 76 bound to OS proc set {76}
OMP: pid 25649 tid 26132 thread 79 bound to OS proc set {79}
OMP: pid 25649 tid 26121 thread 68 bound to OS proc set {68}
OMP: pid 25649 tid 26068 thread 15 bound to OS proc set {15}
OMP: pid 25649 tid 26101 thread 48 bound to OS proc set {48}
OMP: pid 25649 tid 26104 thread 51 bound to OS proc set {51}
OMP: pid 25649 tid 26131 thread 78 bound to OS proc set {78}
OMP: pid 25649 tid 26069 thread 16 bound to OS proc set {16}
OMP: pid 25649 tid 26130 thread 77 bound to OS proc set {77}
OMP: pid 25649 tid 26125 thread 72 bound to OS proc set {72}
OMP: pid 25649 tid 26133 thread 80 bound to OS proc set {80}
OMP: pid 25649 tid 26136 thread 83 bound to OS proc set {83}
OMP: pid 25649 tid 26123 thread 70 bound to OS proc set {70}
OMP: pid 25649 tid 26134 thread 81 bound to OS proc set {81}
OMP: pid 25649 tid 26137 thread 84 bound to OS proc set {84}
OMP: pid 25649 tid 26085 thread 32 bound to OS proc set {32}
OMP: pid 25649 tid 26105 thread 52 bound to OS proc set {52}
OMP: pid 25649 tid 26064 thread 11 bound to OS proc set {11}
OMP: pid 25649 tid 26127 thread 74 bound to OS proc set {74}
OMP: pid 25649 tid 26124 thread 71 bound to OS proc set {71}
OMP: pid 25649 tid 26089 thread 36 bound to OS proc set {36}
OMP: pid 25649 tid 26135 thread 82 bound to OS proc set {82}
OMP: pid 25649 tid 26103 thread 50 bound to OS proc set {50}
OMP: pid 25649 tid 26081 thread 28 bound to OS proc set {28}
OMP: pid 25649 tid 26077 thread 24 bound to OS proc set {24}
OMP: pid 25649 tid 26088 thread 35 bound to OS proc set {35}
OMP: pid 25649 tid 26122 thread 69 bound to OS proc set {69}
OMP: pid 25649 tid 26145 thread 92 bound to OS proc set {92}
OMP: pid 25649 tid 26141 thread 88 bound to OS proc set {88}
OMP: pid 25649 tid 26144 thread 91 bound to OS proc set {91}
OMP: pid 25649 tid 26147 thread 94 bound to OS proc set {94}
OMP: pid 25649 tid 26128 thread 75 bound to OS proc set {75}
OMP: pid 25649 tid 26102 thread 49 bound to OS proc set {49}
OMP: pid 25649 tid 26126 thread 73 bound to OS proc set {73}
OMP: pid 25649 tid 26093 thread 40 bound to OS proc set {40}
OMP: pid 25649 tid 26097 thread 44 bound to OS proc set {44}
OMP: pid 25649 tid 26095 thread 42 bound to OS proc set {42}
OMP: pid 25649 tid 26090 thread 37 bound to OS proc set {37}
OMP: pid 25649 tid 26109 thread 56 bound to OS proc set {56}
OMP: pid 25649 tid 26113 thread 60 bound to OS proc set {60}
OMP: pid 25649 tid 26078 thread 25 bound to OS proc set {25}
OMP: pid 25649 tid 26112 thread 59 bound to OS proc set {59}
OMP: pid 25649 tid 26063 thread 10 bound to OS proc set {10}
OMP: pid 25649 tid 26056 thread 3 bound to OS proc set {3}
OMP: pid 25649 tid 26073 thread 20 bound to OS proc set {20}
OMP: pid 25649 tid 26108 thread 55 bound to OS proc set {55}
OMP: pid 25649 tid 26055 thread 2 bound to OS proc set {2}
OMP: pid 25649 tid 26067 thread 14 bound to OS proc set {14}
OMP: pid 25649 tid 26058 thread 5 bound to OS proc set {5}
OMP: pid 25649 tid 26139 thread 86 bound to OS proc set {86}
OMP: pid 25649 tid 26072 thread 19 bound to OS proc set {19}
OMP: pid 25649 tid 26146 thread 93 bound to OS proc set {93}
OMP: pid 25649 tid 26098 thread 45 bound to OS proc set {45}
OMP: pid 25649 tid 26060 thread 7 bound to OS proc set {7}
OMP: pid 25649 tid 26138 thread 85 bound to OS proc set {85}
OMP: pid 25649 tid 26143 thread 90 bound to OS proc set {90}
OMP: pid 25649 tid 26106 thread 53 bound to OS proc set {53}
OMP: pid 25649 tid 26116 thread 63 bound to OS proc set {63}
OMP: pid 25649 tid 26115 thread 62 bound to OS proc set {62}
OMP: pid 25649 tid 26084 thread 31 bound to OS proc set {31}
OMP: pid 25649 tid 26080 thread 27 bound to OS proc set {27}
OMP: pid 25649 tid 26079 thread 26 bound to OS proc set {26}
OMP: pid 25649 tid 26096 thread 43 bound to OS proc set {43}
OMP: pid 25649 tid 26087 thread 34 bound to OS proc set {34}
OMP: pid 25649 tid 26092 thread 39 bound to OS proc set {39}
OMP: pid 25649 tid 26094 thread 41 bound to OS proc set {41}
OMP: pid 25649 tid 26076 thread 23 bound to OS proc set {23}
OMP: pid 25649 tid 26071 thread 18 bound to OS proc set {18}
OMP: pid 25649 tid 26062 thread 9 bound to OS proc set {9}
OMP: pid 25649 tid 26140 thread 87 bound to OS proc set {87}
OMP: pid 25649 tid 26111 thread 58 bound to OS proc set {58}
OMP: pid 25649 tid 26100 thread 47 bound to OS proc set {47}
OMP: pid 25649 tid 26075 thread 22 bound to OS proc set {22}
OMP: pid 25649 tid 26083 thread 30 bound to OS proc set {30}
OMP: pid 25649 tid 26148 thread 95 bound to OS proc set {95}
OMP: pid 25649 tid 26057 thread 4 bound to OS proc set {4}
OMP: pid 25649 tid 26107 thread 54 bound to OS proc set {54}
OMP: pid 25649 tid 26074 thread 21 bound to OS proc set {21}
OMP: pid 25649 tid 26070 thread 17 bound to OS proc set {17}
OMP: pid 25649 tid 26086 thread 33 bound to OS proc set {33}
OMP: pid 25649 tid 26114 thread 61 bound to OS proc set {61}
OMP: pid 25649 tid 26142 thread 89 bound to OS proc set {89}
OMP: pid 25649 tid 26066 thread 13 bound to OS proc set {13}
OMP: pid 25649 tid 26091 thread 38 bound to OS proc set {38}
OMP: pid 25649 tid 26099 thread 46 bound to OS proc set {46}
OMP: pid 25649 tid 26082 thread 29 bound to OS proc set {29}
OMP: pid 25649 tid 26110 thread 57 bound to OS proc set {57}
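The lines above are a verbose thread-affinity report from the OpenMP runtime (the `OMP: pid … bound to OS proc set {…}` format is what the LLVM/Intel runtime prints when verbose affinity reporting is enabled, e.g. via `KMP_AFFINITY=verbose`). The lines appear in thread-startup order, but every one of the 96 threads is pinned to the OS processor with the same index — a one-to-one binding with no oversubscription. A minimal sketch of checking that property from a few sample lines (the parsing pattern is an assumption based on the format shown above):

```python
import re

# Three sample lines copied from the affinity report above.
log = """\
OMP: pid 25649 tid 26117 thread 64 bound to OS proc set {64}
OMP: pid 25649 tid 26054 thread 1 bound to OS proc set {1}
OMP: pid 25649 tid 26110 thread 57 bound to OS proc set {57}
"""

# Extract (OpenMP thread id, OS processor id) pairs from each report line.
pattern = re.compile(r"thread (\d+) bound to OS proc set \{(\d+)\}")
bindings = {int(t): int(p) for t, p in pattern.findall(log)}

# One-to-one pinning: OpenMP thread i runs on OS proc i.
assert all(thread == proc for thread, proc in bindings.items())
```

Such a flat 1:1 mapping is what you want for STREAM: each thread owns one core, so per-thread memory traffic is stable across iterations.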
-------------------------------------------------------------
STREAM version $Revision: 5.10 $
-------------------------------------------------------------
This system uses 8 bytes per array element.
-------------------------------------------------------------
Array size = 860160000 (elements), Offset = 0 (elements)
Memory per array = 6562.5 MiB (= 6.4 GiB).
Total memory required = 19687.5 MiB (= 19.2 GiB).
Each kernel will be executed 100 times.
The *best* time for each kernel (excluding the first iteration)
will be used to compute the reported bandwidth.
-------------------------------------------------------------
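The memory figures in the header follow directly from the array size and element width printed above (STREAM allocates three arrays, conventionally named a, b and c). A quick check of the arithmetic, using binary MiB/GiB as the report does:

```python
ARRAY_SIZE = 860_160_000      # elements, from "Array size" above
BYTES_PER_ELEM = 8            # "This system uses 8 bytes per array element"
MIB = 1024 ** 2

per_array_mib = ARRAY_SIZE * BYTES_PER_ELEM / MIB
total_mib = 3 * per_array_mib         # STREAM keeps three arrays: a, b, c

print(per_array_mib)                  # 6562.5  -> "Memory per array = 6562.5 MiB"
print(total_mib)                      # 19687.5 -> "Total memory required = 19687.5 MiB"
print(round(per_array_mib / 1024, 1)) # 6.4 GiB, as reported
```

At ~19.2 GiB total, the working set is far larger than any cache level, which is exactly what STREAM requires for the bandwidth numbers to reflect main memory.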
Number of Threads requested = 96
Number of Threads counted = 96
-------------------------------------------------------------
Your clock granularity/precision appears to be 1 microseconds.
Each test below will take on the order of 28532 microseconds.
(= 28532 clock ticks)
Increase the size of the arrays if this shows that
you are not getting at least 20 clock ticks per test.
-------------------------------------------------------------
WARNING -- The above is only a rough guideline.
For best results, please be sure you know the
precision of your system timer.
-------------------------------------------------------------
Function Best Rate MB/s Avg time Min time Max time
Copy: inf 0.000000 0.000000 0.000001
Scale: inf 0.000000 0.000000 0.000001
Add: 377851.4 0.054683 0.054635 0.054760
Triad: inf 0.000000 0.000000 0.000001
-------------------------------------------------------------
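The Copy, Scale and Triad rows report `inf` because their minimum times print as 0.000000 — consistent with this build exercising only the Add kernel, as the run directory name `stream-add` suggests (an inference from the log, not stated by the tool). The Add figure itself can be reproduced from the numbers above: Add (c[i] = a[i] + b[i]) touches three arrays per iteration, and STREAM counts bandwidth in decimal MB (10^6 bytes):

```python
ARRAY_SIZE = 860_160_000   # elements, from the header above
BYTES_PER_ELEM = 8
MIN_TIME = 0.054635        # best (min) time for Add, seconds, from the table

# Add reads a and b and writes c: 3 arrays * 8 bytes * N bytes moved per pass.
bytes_moved = 3 * BYTES_PER_ELEM * ARRAY_SIZE
rate_mb_s = bytes_moved / 1e6 / MIN_TIME   # STREAM uses decimal MB (1e6 bytes)

# ~3.78e5 MB/s, matching the reported 377851.4 MB/s to within rounding
# (the printed min time is itself rounded to six decimals).
print(round(rate_mb_s, 1))
```

At ~378 GB/s sustained over 96 threads, the run is clearly memory-bandwidth-bound rather than compute-bound, which is the intended operating point for STREAM.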
-------------------------------------------------------------
Your experiment path is /beegfs/hackathon/users/eoseret/qaas_runs_test/gmz17.benchmarkcenter.megware.com/177-365-3092/stream-add/run/oneview_runs/compilers/aocc_5/oneview_results_1773653780/tools/lprof_run_0
To display your profiling results:
###################################################################################################################################################################################################################################################
# LEVEL | REPORT | COMMAND #
###################################################################################################################################################################################################################################################
# Functions | Cluster-wide | maqao lprof -df xp=/beegfs/hackathon/users/eoseret/qaas_runs_test/gmz17.benchmarkcenter.megware.com/177-365-3092/stream-add/run/oneview_runs/compilers/aocc_5/oneview_results_1773653780/tools/lprof_run_0 #
# Functions | Per-node | maqao lprof -df -dn xp=/beegfs/hackathon/users/eoseret/qaas_runs_test/gmz17.benchmarkcenter.megware.com/177-365-3092/stream-add/run/oneview_runs/compilers/aocc_5/oneview_results_1773653780/tools/lprof_run_0 #
# Functions | Per-process | maqao lprof -df -dp xp=/beegfs/hackathon/users/eoseret/qaas_runs_test/gmz17.benchmarkcenter.megware.com/177-365-3092/stream-add/run/oneview_runs/compilers/aocc_5/oneview_results_1773653780/tools/lprof_run_0 #
# Functions | Per-thread | maqao lprof -df -dt xp=/beegfs/hackathon/users/eoseret/qaas_runs_test/gmz17.benchmarkcenter.megware.com/177-365-3092/stream-add/run/oneview_runs/compilers/aocc_5/oneview_results_1773653780/tools/lprof_run_0 #
# Loops | Cluster-wide | maqao lprof -dl xp=/beegfs/hackathon/users/eoseret/qaas_runs_test/gmz17.benchmarkcenter.megware.com/177-365-3092/stream-add/run/oneview_runs/compilers/aocc_5/oneview_results_1773653780/tools/lprof_run_0 #
# Loops | Per-node | maqao lprof -dl -dn xp=/beegfs/hackathon/users/eoseret/qaas_runs_test/gmz17.benchmarkcenter.megware.com/177-365-3092/stream-add/run/oneview_runs/compilers/aocc_5/oneview_results_1773653780/tools/lprof_run_0 #
# Loops | Per-process | maqao lprof -dl -dp xp=/beegfs/hackathon/users/eoseret/qaas_runs_test/gmz17.benchmarkcenter.megware.com/177-365-3092/stream-add/run/oneview_runs/compilers/aocc_5/oneview_results_1773653780/tools/lprof_run_0 #
# Loops | Per-thread | maqao lprof -dl -dt xp=/beegfs/hackathon/users/eoseret/qaas_runs_test/gmz17.benchmarkcenter.megware.com/177-365-3092/stream-add/run/oneview_runs/compilers/aocc_5/oneview_results_1773653780/tools/lprof_run_0 #
###################################################################################################################################################################################################################################################