
1 The Precipitation Product error structure Silvia Puca, Emanuela Campione, Corrado DeRosa In collaboration with RMI (Belgium), BFG (Germany), OMSZ (Hungary), UniFe and DPC (Italy), IMWG (Poland), SHMI (Slovakia), ITU TMS (Turkey) Dipartimento della Protezione Civile Italiana

2 Outline: PP Validation group; data used; validation approach (Common and Institute Specific Validation); precipitation classes; statistical scores; Common Validation results; Validation Results publication (web page); next steps.

3 ‘Calibration and validation is a difficult activity in the case of precipitation, due to the natural space-time variability of the precipitation field and the problematic error structure of the ground truth measurements.’ Developer need: any product has to be accompanied by information on its error structure, which is necessary for its correct use in applications.

4 Aims: To improve the accuracy and the applicability of the products delivered during the Development phase:
– supporting the calibration and algorithm tuning,
– generating the information on error structure to accompany the data,
– quantifying improvements stemming from the progressive implementation of new developments.
To monitor data quality and provide feedback for progressive quality improvement during the Operational phase.
Validation (product development): to assess the accuracy, i.e. the difference between the estimated value and the “ground truth”. Calibration: tuning of the algorithm to maximize the accuracy. The calibration and validation activity will accompany all steps of the Development Phase and will also be routinely carried out during the Operational Phase.

5 Outline: PP Validation group; data used; validation approach (Common and Institute Specific Validation); precipitation classes; statistical scores; Common Validation results; Validation Results publication (web page); next steps.

6 PP Validation group
WP-2300 Precipitation validation, Italy (DPC): Silvia Puca (Team leader), silvia.puca@protezionecivile.it
WP-2310 Philosophy, DPC: Silvia Puca, silvia.puca@protezionecivile.it
WP-2320 Validation in Belgium, IRM: Emmanuel Roulin (+ Angelo Rinollo), emmanuel.roulin@oma.be (+ hans.vandevijver@oma.be)
WP-2330 Validation in Germany, BfG: Peer Helmke, helmke@bafg.de
WP-2340 Validation in Hungary, OMSZ: Eszter Lábó, labo.e@met.hu
WP-2350 Validation in Italy, UniFerrara: Federico Porcù, porcu@fe.infn.it (+ a.rinolli@isac.cnr.it)
WP-2360 Validation in Italy, DPC: Silvia Puca, silvia.puca@protezionecivile.it
WP-2370 Validation in Poland, IMWM: Bozena Lapeta, bozena.lapeta@imgw.pl
WP-2380 Validation in Slovakia, SHMÚ: Ján Kaňák, jan.kanak@shmu.sk
WP-2390 Validation in Turkey, ITU: Ibrahim Sonmez + Ahmet Öztopal, oztopal@itu.edu.tr (+ zsen@itu.edu.tr)

7 The PPV rain gauge network is composed of 4100 telemetric stations:
Data sources: rain gauges
Instrument characteristics: telemetric and mechanic
Time domain (near real time / case studies): near real time, case studies
Time resolution: 10–30 min (telemetric), 3–24 h (mechanic)
Spatial distribution: whole national territory
Number of stations: ~390 mechanic (RMI) + 12 telemetric (RMI) + 4160 telemetric (SETHY)
Operational / for research only: operational (RMI) + research (other networks)
Data quality check: telemetric automatically checked; mechanic automatically + manually checked

8 The PPV radar network is composed of 40 C-band radars and 1 Ka-band radar (we now have radars in Turkey):
Data sources: radars
Instrument characteristics: beam width ~1°, max range ~150 km, 250 m resolution, C-band, single polarization, Doppler, polarimetric
Time domain: near real time, case studies
Time resolution: 5 min, 15 min, 30 min, 1 h, 24 h
Spatial distribution: whole national territory
Number of stations: 33 C-band + 1 Ka-band
Operational / for research only: operational
Data quality check: permanent ground clutter removed; monitoring of electronic calibration

9 Validation approach
Two different versions of the PP OBS-2 have been developed by CNR: PR-OBS-2v1.0, based on a neural network algorithm trained on radar data (NEXRAD), and PR-OBS-2v2.0, based on a neural network algorithm trained on a numerical model (MM5).
1) For the Common Validation activity, all Institutes:
- use rain gauges and/or radar data;
- evaluate the comparisons (satellite vs observations) on the satellite native grid, with the same up-scaling techniques;
- evaluate the same monthly statistical scores (multi-categorical and continuous statistics) for the defined precipitation classes.
2) In addition to the common validation, each Institute has developed an Institute Specific Validation activity based on its own knowledge and experience:
- case studies;
- also lightning data, numerical weather prediction and nowcasting products.

10 The Common Validation is based on:
Continuous verification statistics: calculating mean absolute error, root mean square error, correlation coefficient, standard deviation.
Multi-categorical statistics: calculating the contingency table (which allows evaluation of false alarm rate, probability of detection, equitable threat score, Heidke skill score, etc.).

11 Continuous: the statistics are calculated using the numeric values of the satellite precipitation estimate (SPE) and the observation at each point.
Categorical: the statistics are calculated from a contingency table, where each SPE-observation pair is tabulated in the appropriate precipitation classes. Because most of the categorical scores are actually computed for “threshold” intervals (wherein an event occurrence means the observed or SPE value was equal to or greater than the threshold value), entries in the table are appropriately combined to form a 2x2 table for each threshold.
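The combining of table entries described above can be sketched in a few lines. This is a minimal illustration, not the project's actual code; the function name and layout are hypothetical:

```python
import numpy as np

def collapse_to_2x2(table, k):
    """Collapse an NxN multi-category contingency table into a 2x2 table
    for the threshold between class k-1 and class k (event = class >= k).
    Rows are observed classes, columns are SPE classes."""
    table = np.asarray(table)
    hits         = table[k:, k:].sum()   # observed event, SPE event
    misses       = table[k:, :k].sum()   # observed event, SPE no event
    false_alarms = table[:k, k:].sum()   # observed no event, SPE event
    correct_negs = table[:k, :k].sum()   # neither observed nor estimated
    return np.array([[hits, misses], [false_alarms, correct_negs]])
```

Applying this for each threshold k = 1 … N-1 yields the per-threshold 2x2 tables from which the categorical scores are computed.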

12 Scores evaluated for multi-categorical and continuous statistics:
Continuous statistics: mean error, multiplicative bias, mean absolute error, root mean square error, correlation coefficient, standard deviation.
Multi-categorical statistics: ACCURACY, POD, FAR, BIAS, ETS.
Plots: scatter plot, probability density function.

13 Continuous Scores
Mean Absolute Error (MAE): the mean of the absolute differences between the observations and the SPE in the interval. It provides a good measure of the accuracy: the closer the MAE is to zero, the better the accuracy.
Root Mean Square Error (RMSE): the square root of the mean of the squared differences between the observations and the SPE in the interval. It provides a good measure of the accuracy while giving greater weight to the larger differences than the MAE does. The closer the RMSE is to zero, the better the accuracy.
Mean Error (ME, bias): the mean of the arithmetic differences between the SPE and the observations in the interval. It is a measure of SPE bias, where positive values denote overforecasting, negative values denote underforecasting, and zero indicates no bias.
Standard Deviation (StD): shows how much variation there is from the mean; it may be thought of as the average distance of the values from the mean of the distribution. A low standard deviation indicates that the data points tend to be very close to the mean, whereas a high standard deviation indicates that the data are spread out over a large range of values.
Correlation Coefficient: a good measure of linear association or phase error. Visually, the correlation measures how close the points of a scatter plot are to a straight line. It does not take SPE bias into account: it is possible for an SPE with large errors to still have a good correlation coefficient with the observations. Sensitive to outliers.
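The continuous scores above can be computed directly from paired SPE-observation samples. This is a hypothetical sketch with NumPy, assuming the sign convention ME = mean(SPE − observation), so that negative ME indicates underestimation:

```python
import numpy as np

def continuous_scores(spe, obs):
    """Continuous verification scores for satellite precipitation
    estimates (SPE) against ground observations."""
    spe, obs = np.asarray(spe, float), np.asarray(obs, float)
    diff = spe - obs                                 # estimation error
    return {
        "ME":        diff.mean(),                    # mean error (bias)
        "MAE":       np.abs(diff).mean(),            # mean absolute error
        "RMSE":      np.sqrt((diff ** 2).mean()),    # root mean square error
        "StD":       diff.std(),                     # standard deviation of the error
        "corr":      np.corrcoef(spe, obs)[0, 1],    # correlation coefficient
        "mult_bias": spe.mean() / obs.mean(),        # multiplicative bias
    }
```

In the project these scores are evaluated separately within each precipitation class and month.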

14 Categorical Scores
Equitable Threat Score (ETS): measures the fraction of observed and/or forecast events that were correctly predicted, adjusted for hits associated with random chance (for example, it is easier to correctly forecast rain occurrence in a wet climate than in a dry climate). Sensitive to hits. Because it penalises both misses and false alarms in the same way, it does not distinguish the source of SPE error.
Probability of Detection (POD): the fraction of the observed area of a threshold precipitation amount that was correctly forecast. A satellite product with a perfect POD has a value of one; a forecast with the worst possible POD has a value of zero.
False Alarm Rate (FAR): the fraction of the forecasts of a threshold precipitation amount that were incorrect. The worst value is one, the best is zero. Sensitive to false alarms, but ignores misses. Very sensitive to the climatological frequency of the event. Should be used in conjunction with the probability of detection.
Bias: the ratio of the number of forecasts to the number of observations for the given threshold amount. A forecast with perfect bias has a value of one; overforecasting results in a bias greater than one, and underforecasting results in a bias less than one.
Accuracy (Acc): simple and intuitive, but can be misleading, since it is heavily influenced by the most common category, usually “no event” in the case of rare weather.
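As a worked illustration of these definitions (a sketch with hypothetical counts, not project code), the scores follow directly from the four cells of a 2x2 contingency table; note that the “False Alarm Rate” defined above corresponds to the ratio of false alarms to all positive forecasts:

```python
def categorical_scores(hits, misses, false_alarms, correct_negatives):
    """Categorical scores from one 2x2 contingency table."""
    n = hits + misses + false_alarms + correct_negatives
    pod  = hits / (hits + misses)                   # fraction of observed events detected
    far  = false_alarms / (hits + false_alarms)     # fraction of positive forecasts that were wrong
    bias = (hits + false_alarms) / (hits + misses)  # forecast / observed event ratio
    acc  = (hits + correct_negatives) / n           # fraction of all pairs classified correctly
    hits_random = (hits + misses) * (hits + false_alarms) / n  # chance hits
    ets  = (hits - hits_random) / (hits + misses + false_alarms - hits_random)
    return {"POD": pod, "FAR": far, "BIAS": bias, "ACC": acc, "ETS": ets}
```

For example, with 80 hits, 20 misses, 40 false alarms and 860 correct negatives, POD = 0.8 but the bias of 1.2 reveals overforecasting of the event area.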

15 Plots
Scatter plot: plots the SPE values against the observed values. It gives a good first look at the correspondence between SPE and observations. An accurate SPE will have points on or near the diagonal.
Probability Density Function plot.

16 Up-scaling techniques
The comparisons (satellite vs observations) are evaluated on the satellite native grid. The radar and rain gauge data were up-scaled taking into account that the product follows the scanning geometry and IFOV resolution of the AMSU-B and SSM/I instruments. Radar and rain gauge instruments provide many measurements within a single AMSU-B pixel; those measurements were averaged following the AMSU-B antenna pattern. All institutes involved in the PP validation activity use the same up-scaling technique, which was indicated by CNR-ISAC. The codes were developed by the University of Ferrara and RMI.
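The averaging step could look like the following sketch. The Gaussian weighting and the 16 km FWHM are illustrative assumptions standing in for the real AMSU-B antenna pattern, which is not reproduced here:

```python
import numpy as np

def upscale_to_pixel(lat_c, lon_c, obs_lat, obs_lon, obs_val, fwhm_km=16.0):
    """Weighted average of ground measurements around one satellite IFOV
    centred at (lat_c, lon_c), using a Gaussian approximation of the
    antenna pattern (fwhm_km is a hypothetical parameter)."""
    km_per_deg = 111.0  # rough conversion of degrees to kilometres
    dx = (np.asarray(obs_lon) - lon_c) * km_per_deg * np.cos(np.radians(lat_c))
    dy = (np.asarray(obs_lat) - lat_c) * km_per_deg
    d2 = dx ** 2 + dy ** 2                 # squared distance from pixel centre
    sigma = fwhm_km / 2.355                # FWHM -> standard deviation
    w = np.exp(-d2 / (2.0 * sigma ** 2))   # Gaussian antenna-pattern weight
    return float(np.sum(w * np.asarray(obs_val)) / np.sum(w))
```

Measurements near the IFOV centre dominate the average, while gauges tens of kilometres away contribute negligibly, mimicking the roll-off of a real antenna pattern.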

17 Precipitation classes
PRECIPITATION CLASSES for PR-OBS1, PR-OBS2 and PR-OBS3 (PR = precipitation rate; 0.25 mm/h is the threshold for precipitation/no-precipitation):
Validation precipitation classes:
Class # | PR [mm/h]
1 | PR<0.25
2 | 0.25≤PR<0.50
3 | 0.50≤PR<1.00
4 | 1.00≤PR<2.00
5 | 2.00≤PR<4.00
6 | 4.00≤PR<8.00
7 | 8.00≤PR<16.00
8 | 16.00≤PR<32.00
9 | 32.00≤PR<64.00
10 | 64.00≤PR
URD precipitation classes:
Class # | PR [mm/h]
1 | PR<1
2 | 1≤PR<10
3 | 10≤PR
PR-OBS5: 3, 6, 12 and 24 hours accumulated precipitation (AP = accumulated precipitation; 1.00 mm is the threshold for precipitation/no-precipitation):
Class # | AP [mm]
1 | AP<1
2 | 1≤AP<2
3 | 2≤AP<4
4 | 4≤AP<8
5 | 8≤AP<16
6 | 16≤AP<32
7 | 32≤AP<64
8 | 64≤AP<128
9 | 128≤AP<256
10 | 256≤AP
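Binning a rain-rate value into the validation classes can be sketched as follows (illustrative code, not from the project; the bounds assume the class edges double from 0.25 to 64 mm/h, i.e. class 7 covers 8.00≤PR<16.00):

```python
import bisect

# Upper/lower class edges (mm/h) for the instantaneous rain-rate products
RATE_BOUNDS = [0.25, 0.50, 1.00, 2.00, 4.00, 8.00, 16.00, 32.00, 64.00]

def rate_class(pr_mm_h):
    """Return the 1-based validation precipitation class for a rain rate.
    Class 1 is PR < 0.25 mm/h, the precipitation/no-precipitation threshold."""
    return bisect.bisect_right(RATE_BOUNDS, pr_mm_h) + 1
```

`bisect_right` places a value that equals a class edge into the higher class, matching the left-closed intervals above (e.g. PR = 0.25 mm/h falls in class 2).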

18 H01 continuous statistic: radar data and rain gauge, LAND. Period: September 2008 – June 2009. (H02: new version.)

19 H01 continuous statistic: radar and rain gauge
Overall index, H01 (mm/h): mean error 0.04; multiplicative bias 3.39; mean absolute error 0.13; root mean square error 0.30; correlation coefficient 0.25; standard deviation of the error 0.34.

Period: September 2008 – December 2008
Class | N RAD | ME [mm/h] | StD [mm/h] | MAE [mm/h] | RMSE [mm/h] | URDrmse [%] | N RG | ME [mm/h] | StD [mm/h] | MAE [mm/h] | RMSE [mm/h] | URDrmse [%]
0.25≤PR<1 | 163116 | -0.04 | 1.16 | 0.73 | 1.24 | 251.73 | 112635 | 0.05 | 1.60 | 0.86 | 1.62 | 289.68
1≤PR<10 | 69283 | -0.87 | 2.05 | 1.84 | 2.39 | 117.34 | 45549 | -0.42 | 3.01 | 2.16 | 3.22 | 145.89
10.00≤PR | 602 | -10.94 | 6.15 | 12.14 | 13.03 | 77.05 | 261 | -12.16 | 5.68 | 12.86 | 13.70 | 74.47
MEAN | 233001 | -0.31 | 1.44 | 1.09 | 1.61 | 211.32 | 158445 | -0.12 | 2.03 | 1.27 | 2.13 | 246.37

Period: January 2009 – June 2009
Class | N RAD | ME [mm/h] | StD [mm/h] | MAE [mm/h] | RMSE [mm/h] | URDrmse [%] | N RG | ME [mm/h] | StD [mm/h] | MAE [mm/h] | RMSE [mm/h] | URDrmse [%]
0.25≤PR<1 | 8797 | -0.41 | 0.23 | 0.48 | 0.51 | 105.09 | 17501 | -0.56 | 0.02 | 0.67 | 0.66 | 109.83
1≤PR<10 | 3222 | -2.43 | 0.74 | 2.48 | 2.66 | 98.90 | 6138 | -7.22 | 0.09 | 7.32 | 3.03 | 147.13
10.00≤PR | 215 | -15.55 | 2.25 | 15.55 | 12.75 | 100.00 | 1047 | -19.45 | 1.05 | 19.45 | 14.26 | 104.95
MEAN | 12234 | -1.20 | 0.40 | 1.27 | 1.29 | 103.37 | 24686 | -2.65 | 0.06 | 2.75 | 1.52 | 119.57

There is an evident increase of the errors in the higher precipitation class. ME = Mean Error, StD = Standard Deviation, MAE = Mean Absolute Error, RMSE = Root Mean Square Error; URDrmse = Root Mean Square Error as defined in the URD document.

20 H01: Multi-category statistic
POD (rain/no-rain): 0.95; FAR (rain/no-rain): 0.04.
Good values of POD and FAR for rain/no-rain; clear underestimation of the precipitation.

Rain/no-rain contingency table (rows: radar; columns: satellite; entries are row fractions):
RD \ SAT | PR<0.25 | 0.25<PR | tot RD
PR<0.25 | 0.96 | 0.04 | 139695
0.25<PR | 0.82 | 0.18 | 7383
tot SAT | 140049 | 7029 | 147078

Multi-category contingency table:
RD \ SAT | PR<0.25 | 0.25<PR<1 | 1<PR<10 | 10<PR | tot RD
PR<0.25 | 0.960 | 0.023 | 0.017 | 0.000 | 139695
0.25<PR<1 | 0.879 | 0.052 | 0.066 | 0.003 | 5713
1<PR<10 | 0.614 | 0.073 | 0.306 | 0.007 | 1668
10<PR | 1.000 | 0.000 | 0.000 | 0.000 | 2
tot SAT | 140049 | 3601 | 3334 | 94 | 147078

21 Coast/land analysis, H01. University of Ferrara: F. Porcù

22 Coast/land analysis, H01. University of Ferrara: F. Porcù

23 H02 continuous statistic: radar data and rain gauge, LAND. Period: September 2008 – June 2009.

24 H02 continuous statistic: radar and rain gauge

Period: September 2008 – December 2008
Class | N RAD | ME [mm/h] | StD [mm/h] | MAE [mm/h] | RMSE [mm/h] | URDrmse [%] | N RG | ME [mm/h] | StD [mm/h] | MAE [mm/h] | RMSE [mm/h] | URDrmse [%]
0.25≤PR<1 | 134299 | -0.15 | 0.82 | 0.58 | 1.00 | 185.89 | 69358 | -0.22 | 0.97 | 0.60 | 0.98 | 217.35
1≤PR<10 | 63441 | -0.88 | 1.73 | 1.75 | 2.33 | 111.99 | 27808 | -1.10 | 2.24 | 1.90 | 3.41 | 149.45
10.00≤PR | 624 | -8.72 | 6.59 | 11.42 | 12.54 | 69.59 | 167 | -10.49 | 5.58 | 11.40 | 13.11 | 81.68
MEAN | 198364 | -0.41 | 1.13 | 0.99 | 1.46 | 161.89 | 97333 | -0.54 | 1.39 | 1.05 | 1.62 | 195.21

Period: January 2009 – June 2009
Class | N RAD | ME [mm/h] | StD [mm/h] | MAE [mm/h] | RMSE [mm/h] | URDrmse [%] | N RG | ME [mm/h] | StD [mm/h] | MAE [mm/h] | RMSE [mm/h] | URDrmse [%]
0.25≤PR<1 | 31172 | -0.21 | 0.48 | 0.51 | 0.68 | 135.33 | 39683 | -0.44 | 0.85 | 0.59 | 0.95 | 208.09
1≤PR<10 | 8440 | -0.57 | 1.12 | 1.32 | 1.63 | 95.24 | 43260 | -2.38 | 2.05 | 2.90 | 3.28 | 106.03
10.00≤PR | 26 | -7.14 | 1.20 | 7.20 | 7.32 | 62.50 | 5250 | -9.90 | 6.29 | 13.72 | 14.11 | 91.50
MEAN | 39638 | -0.29 | 0.62 | 0.69 | 0.89 | 126.74 | 88193 | -0.86 | 1.11 | 1.09 | 1.45 | 186.28

There is an evident increase of the errors in the higher precipitation class. ME = Mean Error, StD = Standard Deviation, MAE = Mean Absolute Error, RMSE = Root Mean Square Error; URDrmse = Root Mean Square Error as defined in the URD document.

25 H02: Multi-category statistic
Validation precipitation classes for LAND areas. Data used: RADAR. Period: October 2008.

Multi-category contingency table (rows: radar; columns: satellite; entries are row fractions):
RD \ SAT | PR<0.25 | 0.25<PR<1 | 1<PR<10 | 10<PR | tot RD
PR<0.25 | 0.989 | 0.006 | 0.003 | 0 | 292709
0.25<PR<1 | 0.818 | 0.093 | 0.086 | 0.001 | 9097
1<PR<10 | 0.626 | 0.131 | 0.237 | 0.004 | 7601
10<PR | 0.437 | 0.158 | 0.379 | 0.025 | 709
tot SAT | 302293 | 3821 | 3863 | 139 | 310116

Rain/no-rain contingency table:
RD \ SAT | PR<0.25 | 0.25<PR | tot RD
PR<0.25 | 0.989 | 0.009 | 292709
0.25<PR | 0.719 | 0.281 | 17407
tot SAT | 302293 | 7823 | 310116

POD (rain/no-rain) = 0.958; FAR (rain/no-rain) = 0.003.
Good values of POD and FAR for rain/no-rain; clear underestimation of the precipitation, but more capacity to discriminate the precipitation than H01.

26 H03 continuous statistic: radar data and rain gauge, LAND. Period: September 2008 – June 2009.

27 H03 continuous statistic: radar and rain gauge

Period: September 2008 – December 2008
Class | N RAD | ME [mm/h] | StD [mm/h] | MAE [mm/h] | RMSE [mm/h] | URDrmse [%] | N RG | ME [mm/h] | StD [mm/h] | MAE [mm/h] | RMSE [mm/h] | URDrmse [%]
0.25≤PR<1 | 8374045 | -0.31 | 0.53 | 0.52 | 0.74 | 177.65 | 803197 | -1.18 | 0.01 | 1.33 | 0.94 | 189.18
1≤PR<10 | 2547035 | -1.65 | 0.94 | 1.88 | 2.07 | 105.93 | 391313 | -13.22 | 0.09 | 13.46 | 5.54 | 204.57
10.00≤PR | 19063 | -15.94 | 3.91 | 15.96 | 16.59 | 94.47 | 1147818 | -14.73 | 2.10 | 14.76 | 14.06 | 97.49
MEAN | 10940143 | -0.65 | 0.63 | 0.87 | 1.08 | 160.81 | 2342328 | -4.01 | 0.03 | 4.18 | 2.03 | 192.61

Period: January 2009 – June 2009
Class | N RAD | ME [mm/h] | StD [mm/h] | MAE [mm/h] | RMSE [mm/h] | URDrmse [%] | N RG | ME [mm/h] | StD [mm/h] | MAE [mm/h] | RMSE [mm/h] | URDrmse [%]
0.25≤PR<1 | 14767131 | -0.44 | 0.29 | 0.50 | 0.58 | 108.48 | 69358 | -0.43 | 0.35 | 0.53 | 0.60 | 131.65
1≤PR<10 | 4178286 | -1.70 | 0.58 | 1.75 | 1.85 | 96.41 | 27808 | -1.56 | 0.68 | 1.68 | 1.76 | 110.80
10.00≤PR | 15838 | -14.51 | 3.94 | 17.20 | 18.05 | 95.05 | 167 | -14.26 | 2.69 | 14.69 | 15.21 | 83.15
MEAN | 18961255 | -0.73 | 0.36 | 0.79 | 0.87 | 105.81 | 97333 | -0.69 | 0.42 | 0.79 | 0.87 | 127.02

There is an evident increase of the errors in the higher precipitation class. ME = Mean Error, StD = Standard Deviation, MAE = Mean Absolute Error, RMSE = Root Mean Square Error; URDrmse = Root Mean Square Error as defined in the URD document.

28 H03: Multi-category statistic

Rain/no-rain contingency table (rows: radar; columns: satellite; entries are row fractions):
RD \ SAT | PR<0.25 | 0.25<PR | tot RD
PR<0.25 | 0.957 | 0.040 | 51880730
0.25<PR | 0.723 | 0.276 | 2324380
tot SAT | 51298933 | 2906177 | 54205110

POD (rain/no-rain): 0.967; FAR (rain/no-rain): 0.040.

Multi-category contingency table:
RD \ SAT | PR<0.25 | 0.25<PR<1 | 1<PR<10 | 10<PR | tot RD
PR<0.25 | 0.957 | 0.026 | 0.013 | 0.001 | 51880730
0.25<PR<1 | 0.740 | 0.167 | 0.088 | 0.003 | 1727431
1<PR<10 | 0.673 | 0.153 | 0.162 | 0.010 | 588126
10<PR | 0.647 | 0.065 | 0.273 | 0.028 | 8823
tot SAT | 51298933 | 1871293 | 997318 | 37566 | 54205110

Good values of POD and FAR for rain/no-rain; clear underestimation of the precipitation, but several cases of overestimation of the precipitation area.

29 Coast/land analysis, H03. University of Ferrara: F. Porcù

30 Coast/land analysis, H03. University of Ferrara: F. Porcù

31 H05: Continuous statistic

Radar comparison:
Class [mm] | N RADAR | ME [mm] | SD [mm] | MAE [mm] | RMSE [mm] | URDrmse
AP<8 | 13920871 | -0.360 | 1.363 | 1.063 | 1.609 | 127%
8≤AP<32 | 3562429 | -1.418 | 3.006 | 3.680 | 4.396 | 86%
32≤AP<64 | 162626 | -5.133 | 3.465 | 6.041 | 6.546 | 86%
64≤AP<128 | 170027 | -7.948 | 4.326 | 8.408 | 9.257 | 92%
128≤AP | 25758 | -14.42 | 4.997 | 14.421 | 15.308 | 87%
MEAN | 17841711 | -0.707 | 1.744 | 1.721 | 2.303 | 107%

Rain gauge comparison:
Class [mm] | N RAIN GAUGE | ME [mm] | SD [mm] | MAE [mm] | RMSE [mm] | URDrmse
AP<8 | 4943648 | 489652.3 | -5.927 | 0.413 | 0.011 | 469%
8≤AP<32 | 3029460 | 149317.7 | -48.705 | 6.286 | 0.033 | 2410%
32≤AP<64 | 601979 | 37553.23 | -159.432 | 56.892 | 0.014 | 6882%
64≤AP<128 | 170098 | 12010.45 | -356.612 | 38.898 | 0.061 | 11699%
128≤AP | 29611 | 7381.739 | -298.066 | 52.913 | 0.129 | 23309%
MEAN | 8774796 | 838385.3 | -19.631 | 4.449 | 0.016 | 10546%

ME = Mean Error, SD = Standard Deviation, MAE = Mean Absolute Error, RMSE = Root Mean Square Error; URDrmse = Root Mean Square Error as defined in the URD document. A verification of the rain gauge validation results is necessary!

32 Some conclusions
All the PP were validated by comparison with both radar and rain gauge data by 7 countries.
Multi-category and continuous statistical scores were evaluated.
All the statistical scores evaluated and the case studies analysed are available on the AM ftp server.
H01:
- the majority of the precipitation is estimated as less than 0.25 mm/h by H01;
- there is a general underestimation of the precipitation;
- no strong seasonal component is present;
- there is an evident increase of the errors in the lower classes with respect to the previous version.

33 Some conclusions
H02:
- there is a general underestimation, but more capacity to discriminate precipitation greater than 0.25 mm/h;
- a seasonal component is present;
- there is an evident increase of the errors in the higher precipitation class;
- problem with NOAA-16: replacement of a channel (noise effect).
H03:
- there is a general underestimation of the precipitation rate and an overestimation of the precipitation area;
- a seasonal component is present;
- there is an evident increase of the errors in the higher precipitation class;
- heavy convective precipitation events were underestimated;
- moderate and light convective precipitation events were often overestimated.
H05:
- some unrealistic precipitation values;
- not enough results.

34 Validation Results publication
Rep 3: collects all the results of the PP Validation activity; it is a rolling document.
User Requirement Documents: summarise the PP validation results.
Web page: all the results are in the validation section of the H-SAF web page.

35 Next steps
THANK YOU!

