(i) Sky images. Camera: total sky imager 440A. Resolution: pixels. Frequency: every 30 s. Image format: JPG. (ii) CBH
Cloud segmentation: the RBR of each image pixel was evaluated using both clear-sky image data and a sunshine parameter (SP). Tracking method: cross-correlation method (CCM); the sky images were partitioned into square subsets of pixels of equal size. Forecasting method: by mapping the cloud shadow onto the ground and using the average cloud velocity (assuming spatially homogeneous cloud motion), the occlusion time was obtained. The drop in GHI due to clouds was assumed to equal 40% of the clear-sky GHI value
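The cross-correlation tracking step can be sketched as a brute-force search for the integer shift that best aligns two consecutive binary cloud masks. This is a minimal illustrative stand-in, not the paper's implementation; the function name and search radius are assumptions.

```python
import numpy as np

def estimate_cloud_motion(prev_mask, curr_mask, max_shift=5):
    """Estimate the cloud displacement (dy, dx) between two binary
    cloud masks by maximizing the overlap (cross-correlation) over
    integer shifts. Illustrative sketch of the CCM idea only."""
    best_score, best_shift = -np.inf, (0, 0)
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            shifted = np.roll(np.roll(prev_mask, dy, axis=0), dx, axis=1)
            score = np.sum(shifted * curr_mask)
            if score > best_score:
                best_score, best_shift = score, (dy, dx)
    return best_shift

# Synthetic example: a square "cloud" moved 2 px down and 3 px right.
prev = np.zeros((32, 32))
prev[8:14, 8:14] = 1.0
curr = np.roll(np.roll(prev, 2, axis=0), 3, axis=1)
motion = estimate_cloud_motion(prev, curr)
```

In practice the correlation is computed per image block so that different sky regions can move independently; here a single global shift is recovered for brevity.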
Cloud segmentation: the difference between the blue and red color channels of each image pixel was compared with a threshold. Tracking method: Lucas-Kanade optical flow algorithm. Forecasting method: the pixel velocity was obtained using linear regression; from this velocity, feature-point trajectories were constructed, and the time taken by the feature points to pass a specific location in the image was obtained
Occlusion signals were generated 30 seconds ahead of time
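The trajectory-timing idea above can be made concrete with simple kinematics: given a feature point's image position and a constant pixel velocity, the time until it passes the sun's position follows from a closest-approach calculation. The helper below is hypothetical and assumes straight-line motion.

```python
import numpy as np

def time_to_occlusion(feature_pos, velocity, sun_pos, radius=5.0):
    """Return the closest-approach time (in frames) of a feature point
    moving with constant pixel velocity toward the sun position,
    provided the trajectory passes within `radius` pixels; otherwise
    None. Hypothetical helper illustrating the trajectory idea."""
    rel = np.asarray(sun_pos, dtype=float) - np.asarray(feature_pos, dtype=float)
    v = np.asarray(velocity, dtype=float)
    speed2 = v @ v
    if speed2 == 0.0:
        return None          # stationary feature never occludes
    t = (rel @ v) / speed2   # time of closest approach
    miss = rel - t * v       # offset from the sun at that time
    if t < 0 or np.linalg.norm(miss) > radius:
        return None
    return float(t)

# Feature at the origin moving right at 1 px/frame, sun 10 px away.
t_occ = time_to_occlusion((0.0, 0.0), (1.0, 0.0), (10.0, 0.0))
```

With a 30 s frame interval, a result of 10 frames corresponds to a 5 min lead time; the 30 s ahead signal in the study corresponds to one frame.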
(i) Sky images. Camera: total sky imager (TSI). Frequency: every 30 s. (ii) Pyranometer irradiance measurements
Tracking method: fast cross-correlation algorithm. Forecasting method: a linear prediction model was introduced for irradiance forecasting based on cloud motion estimates and previously monitored solar irradiance data. From the motion vectors, the future cloud motion over the location of the solar panels was estimated. The time-series model was defined using radiation data and the change in TSI image RBR values relative to the previous step over the selected window
1 min and 2 min ahead irradiance forecasts were obtained
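A linear prediction model of the kind described can be sketched as an autoregressive fit by least squares. This is a minimal stand-in: the paper's exact formulation (which also folds in RBR-change readings) is not reproduced, and the function names are assumptions.

```python
import numpy as np

def fit_ar(series, order=2):
    """Least-squares fit of a linear autoregressive predictor
    y[t] ~ a1*y[t-1] + ... + a_order*y[t-order].
    Minimal sketch of a linear irradiance time-series model."""
    y = np.asarray(series, dtype=float)
    # Lagged design matrix: column k holds y[t-k-1] for t = order..n-1.
    X = np.column_stack([y[order - k - 1 : -k - 1] for k in range(order)])
    coef, *_ = np.linalg.lstsq(X, y[order:], rcond=None)
    return coef

def predict_next(series, coef):
    """One-step-ahead forecast from the most recent len(coef) samples."""
    order = len(coef)
    recent = np.asarray(series, dtype=float)[-1 : -order - 1 : -1]
    return float(coef @ recent)

# Synthetic irradiance history with a linear trend: next value is 10.
history = list(range(10))
coef = fit_ar(history)
pred = predict_next(history, coef)
```

With 30 s samples, iterating `predict_next` two or four steps gives the 1 min and 2 min horizons reported.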
(i) Sky images. Camera: UCSD sky imager. Resolution: pixels. Frequency: every 30 s. (ii) CBH
Cloud segmentation: the sky was segmented into three categories by applying a threshold to the RBR channel and comparing the images with a clear-sky model. Tracking method: CCM applied to the RBR of two consecutive images as in [10]. Forecasting method: the velocity of all clouds was assumed to be homogeneous. Three clearness-index values, one per sky condition, were used to generate irradiance forecasts
5 min, 10 min, and 15 min ahead forecasts were obtained
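The three-category RBR segmentation can be sketched as two thresholds on the per-pixel red-blue ratio. The threshold values below are illustrative assumptions, not the study's calibrated values, and no clear-sky-model comparison is included.

```python
import numpy as np

def segment_rbr(rgb, thin=0.8, thick=0.95):
    """Label each pixel of an RGB sky image as 0 = clear sky,
    1 = thin cloud, or 2 = thick cloud from its red-blue ratio (RBR).
    Thresholds are illustrative placeholders."""
    r = rgb[..., 0].astype(float)
    b = rgb[..., 2].astype(float)
    rbr = r / np.maximum(b, 1e-6)   # avoid division by zero
    labels = np.zeros(rbr.shape, dtype=int)
    labels[rbr >= thin] = 1
    labels[rbr >= thick] = 2
    return labels

# Tiny synthetic image covering all three classes.
img = np.zeros((2, 2, 3))
img[0, 0] = (50, 0, 200)    # RBR 0.25 -> clear sky
img[0, 1] = (170, 0, 200)   # RBR 0.85 -> thin cloud
img[1, 0] = (200, 0, 200)   # RBR 1.00 -> thick cloud
img[1, 1] = (190, 0, 200)   # RBR 0.95 -> thick cloud
labels = segment_rbr(img)
```

Each class then maps to its own clearness-index value when the forecast is generated.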
(i) Sky images. Camera: IP security camera with a 180° lens. Resolution: pixels. Frequency: every 10 s. Image format: JPEG
Cloud segmentation: a machine learning model developed using pixel color components (hue, saturation, and R, G, B values), RBR, RBD, pixel distance from the sun, and the solar zenith and azimuth angles. Tracking method: dense optical flow algorithm. Forecasting method: future sun-occluding paths were constructed from the motion vectors, and the timing and extent of sun-shading events were predicted
The timing and extent of sun shading events were predicted
(i) Sky images. Camera: wide-angle C-mount camera. Resolution: pixels. Frequency: every 5 s. (ii) Irradiance measurements
Cloud segmentation: RBR method. Tracking method: Thirion's demons algorithm. Forecasting method: cloud motion velocity was extracted from a dense field of cloud displacement vectors. Occlusions were determined from the cloud velocities, and the clear-sky index was combined with a Kalman filter to improve short-term forecasts (below 3 min)
Continuous irradiance was forecasted for time intervals of up to 10 min
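The Kalman-filter refinement of the clear-sky index can be illustrated with a one-dimensional filter under a random-walk state model. The state model and the noise settings `q` and `r` are generic assumptions; the study's actual filter design is not specified here.

```python
import numpy as np

def kalman_filter_csi(measurements, q=1e-3, r=1e-2, x0=1.0, p0=1.0):
    """Scalar Kalman filter over a clear-sky-index series, assuming a
    random-walk state. Generic sketch, not the paper's exact filter."""
    x, p = x0, p0
    filtered = []
    for z in measurements:
        p += q                 # predict: random-walk process noise
        k = p / (p + r)        # Kalman gain
        x += k * (z - x)       # update state toward measurement z
        p *= (1.0 - k)         # update error covariance
        filtered.append(x)
    return np.array(filtered)

# Clear sky (index 1.0) followed by a cloud passage (index 0.4):
# the filtered estimate tracks the drop while damping noise.
csi = kalman_filter_csi([1.0, 1.0, 0.4, 0.4, 0.4])
```

The filtered index multiplies the clear-sky GHI model to give the short-term irradiance forecast.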
(i) Sky images. Camera: UCSD sky imager. Resolution: . Frequency: every 30 s
Cloud segmentation: red-blue ratio (RBR) method. Tracking method: variational optical flow (VOF) technique. Forecasting method: the VOF forecasts of the binary sky images were transformed to Cartesian coordinates to generate the VOF-based forecast
Cloud trajectory lengths were forecasted for 1 min to 15 min ahead
(i) Sky images from 3 cameras. Camera: total sky imager. Resolution: pixels. Frequency: every 10 s
Cloud segmentation: a supervised classifier was developed to detect clouds at the pixel level. Tracking method: cloud block-matching method. Forecasting method: onsite CBH was obtained using the three cameras; regression-based forecasting was performed using image features of the clouds together with cloud-block motion vectors and CBHs
Cloud segmentation: RBR method. Tracking method: optical flow algorithm applied to the feature points in two consecutive binary images. Forecasting method: using CBH measurements and solar zenith angles, the cloud shadow was mapped onto the ground; the irradiance drop was forecasted from the shadow movement relative to the plant location
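The shadow-mapping geometry is simple trigonometry: on flat ground, a cloud's shadow is displaced from the point directly beneath it by the cloud-base height times the tangent of the solar zenith angle, in the direction away from the sun. The sketch below assumes a flat-ground model and an east/north convention not stated in the source.

```python
import math

def shadow_offset(cbh_m, zenith_deg, azimuth_deg):
    """Horizontal displacement (east, north) in metres of a cloud's
    shadow relative to the point directly below the cloud, given
    cloud-base height and solar zenith/azimuth (azimuth measured
    clockwise from north). Flat-ground geometric sketch only."""
    d = cbh_m * math.tan(math.radians(zenith_deg))  # shadow throw distance
    az = math.radians(azimuth_deg)
    # The shadow falls on the opposite side from the sun's azimuth.
    return (-d * math.sin(az), -d * math.cos(az))

# Sun due south at 45 deg elevation, cloud base at 1000 m:
# the shadow lands 1000 m north of the point below the cloud.
east, north = shadow_offset(1000.0, 45.0, 180.0)
```

Combining this offset with the tracked cloud velocity gives the ground track of the shadow, from which the time of the irradiance drop at the plant follows.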
(i) Sky images. Camera: total sky imager. Resolution: pixels. Frequency: every 30 s. (ii) CBH
Cloud segmentation and cloud type classification: RBR method. Tracking method: an improved Fourier phase correlation method based on an affine transform, exploiting the phase-shift-invariance property of images. Forecasting method: images were first undistorted according to the cloud-base height; the blue-sky area was then separated, and the clouds were classified. After classification, a sky-image-to-irradiance mapping model was developed; a backpropagation neural network (BPNN) and a support vector machine (SVM) were adopted for model training
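The sky-image-to-irradiance mapping step amounts to learning a regression from image-derived features to measured irradiance. The study trains a BPNN and an SVM; the sketch below substitutes ordinary least squares purely to make the feature-to-irradiance idea concrete, with entirely synthetic feature names and data.

```python
import numpy as np

# Hypothetical image features per sample, e.g. cloud-cover fraction,
# mean RBR, and a sun-blockage indicator (all synthetic here).
rng = np.random.default_rng(0)
features = rng.random((100, 3))

# Synthetic "measured" GHI generated from an exact linear relation,
# so the fitted map should reproduce it.
true_w = np.array([-400.0, -150.0, -250.0])
ghi = 900.0 + features @ true_w

# Fit the linear feature -> irradiance map by least squares
# (stand-in for the BPNN/SVM training in the study).
A = np.column_stack([np.ones(len(features)), features])
w, *_ = np.linalg.lstsq(A, ghi, rcond=None)
predicted = A @ w
```

At forecast time, the same features extracted from a predicted future sky state are fed through the trained map to obtain the irradiance forecast.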