find new turbidity data #101
I pulled the following data from the SoDa website's TL form. It's fairly similar to the data plotted above (I don't expect an exact match). I also grabbed their tif files and they too look similar. At least pvlib-python isn't screwing something up. As a first step, we could just give some indication of the uncertainty/quality of the data. Maybe some of you knew about this already, but I'd been using this function for awhile and never thought to check. @cwhanse you mentioned in #95 that there are other sources for turbidity data. Do you or other folks at Sandia have experience with them? Maybe we should revisit #71 too.
@wholmgren: the file of Linke turbidity values that we distributed with PV_LIB was pulled from SoDa, or from an earlier incarnation that has been rolled into SoDa. Matt is out this week but I'll find out exactly what he did. There are other ways to compute TL from data, but now that I've looked again I don't know of other TL databases available. 0.5 units of uncertainty appears typical, see: http://www.sciencedirect.com/science/article/pii/S0038092X0600291X
ECMWF has aerosol and water vapor data for the past 10 years from NASA Aeronet, and they have a Python web client for their MARS API, but you must register. You will need to calculate alpha from AOD at different wavelengths, and combine it with water vapor to get Linke turbidity coefficients.
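The AOD-to-alpha step above can be sketched with the Angstrom power law, `aod(lambda) ~ lambda**-alpha`. A minimal sketch, assuming AOD at two wavelengths (the 500 nm and 870 nm channels and the values themselves are made up for illustration, not real Aeronet data):

```python
import numpy as np

def angstrom_alpha(aod1, lambda1, aod2, lambda2):
    """Angstrom exponent from AOD at two wavelengths (nm)."""
    # power law: aod(lambda) = beta * lambda**-alpha
    return -np.log(aod1 / aod2) / np.log(lambda1 / lambda2)

# hypothetical spectral AOD values
aod_500, aod_870 = 0.10, 0.05
alpha = angstrom_alpha(aod_500, 500.0, aod_870, 870.0)

# interpolate AOD to an intermediate wavelength with the same power law
aod_700 = aod_500 * (700.0 / 500.0) ** -alpha
```

The interpolated value necessarily falls between the two measured values, which is a quick sanity check on the exponent.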
Maybe there's an API for NASA Aeronet. It looks like a RESTful service, but hacking it might be hard.
Some other querystring keys that determine the extents of the map on the data download tool page: long1, long2, lat1, lat2, multiplier, what_map, nachal, formatter. There's also a "download all" link with usage guidelines, and some metadata about all of their data.
After making this issue I did some follow-up analysis that I forgot to post. The analysis is only for Tucson, AZ, and I make no claims as to the generalizability of the results. That being said...

I used 10 years of data from SoDa's McClear service to create a new clear sky climatology, and compared this McClear-based climatology to a year of DNI, GHI, and DHI measurements on clear days. I found that the Ineichen model (or at least the pvlib-python 0.3.0dev version of it) is similar in accuracy to the McClear-based climatology, except in the summer, where Ineichen is best. From what I've read, McClear is supposed to be pretty good, but the main reason I used it was that it was relatively easy to get processed data instead of needing to implement a new model and track down a bunch of different data sources.

So, while the turbidity data image looks bad, I'm less inclined to say that it's trash, and I am inclined to close this issue, at least for now. Here's the notebook and html rendering with the full analysis. It's big. I would be happy if you discovered errors in the analysis.
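For reference, the climatology step described above amounts to averaging multi-year modeled irradiance by day of year. A minimal sketch with pandas, using synthetic data as a stand-in for the McClear time series (the seasonal curve and noise are invented for illustration):

```python
import numpy as np
import pandas as pd

# synthetic stand-in for ~10 years of daily clear-sky GHI
times = pd.date_range('2006-01-01', '2015-12-31', freq='D')
rng = np.random.default_rng(0)
ghi = pd.Series(
    250 + 100 * np.sin(2 * np.pi * (times.dayofyear - 80) / 365.25)
    + rng.normal(0, 10, len(times)),
    index=times)

# climatology: average over the years for each day of year
climatology = ghi.groupby(ghi.index.dayofyear).mean()

# look up the climatological value for an arbitrary date
est = climatology[pd.Timestamp('2016-06-01').dayofyear]
```

A smoothing pass (e.g. a rolling mean over the day-of-year axis) is often applied before using such a climatology, since single-day bins stay noisy even with ten years of data.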
@mikofski thanks for the links. Do you have any experience with those data sources, and can you comment on their relative accuracy? I suspect we'd find results similar to the above if we used them to create a new climatology data set. Well, at least for hot and dry Arizona.
I've been doing some modeling for PNM today and thought that I'd add a new figure to this thread for future reference. The main point of adding this here is to better quantify the spread within a few hundred km, but apologies for the repeating colors and the rough code.

```python
import matplotlib.pyplot as plt
import pandas as pd
import pvlib

pnm_solar = {'lat': {3100: 32.866092999999999,
                     3101: 32.168793000000001,
                     3102: 35.644624999999998,
                     3103: 34.832436999999999,
                     3104: 34.746503999999995,
                     3105: 32.975503000000003,
                     3106: 35.168993,
                     3107: 35.002102700000002,
                     3108: 34.810000000000002,
                     3109: 35.283634999999997,
                     3110: 35.258000000000003,
                     3112: 35.012121999999998,
                     3113: 34.985723,
                     3114: 34.637019000000002,
                     3115: 35.564269000000003},
             'lon': {3100: -105.99988799999998,
                     3101: -107.75567700000001,
                     3102: -105.20570600000001,
                     3103: -106.77180600000001,
                     3104: -106.655203,
                     3105: -105.97943100000001,
                     3106: -106.600049,
                     3107: -106.6382121,
                     3108: -106.52200000000001,
                     3109: -106.81301699999999,
                     3110: -107.245,
                     3112: -106.85783600000001,
                     3113: -106.73328600000001,
                     3114: -106.706005,
                     3115: -106.089483},
             'name': {3100: 'Alamogordo',
                      3101: 'Deming',
                      3102: 'Las Vegas',
                      3103: 'Los Lunas',
                      3104: 'Manzano',
                      3105: 'Otero County',
                      3106: 'Reeves',
                      3107: 'Prosperity Energy',
                      3108: 'Meadow Lake',
                      3109: 'Sandoval County',
                      3110: 'Cibola County',
                      3112: 'Santolina',
                      3113: 'South Valley',
                      3114: 'Rio Communities',
                      3115: 'Santa Fe'}}

pnm_solar_df = pd.DataFrame(pnm_solar)
times = pd.date_range(start='2016-01-01', end='2016-12-31', freq='1D')

# look up the bundled Linke turbidity climatology for each site
turbidities = {}
for sysid, system in pnm_solar_df.iterrows():
    turbidities[sysid] = pvlib.clearsky.lookup_linke_turbidity(
        times, system['lat'], system['lon'])

fig, ax = plt.subplots(figsize=(16, 7))
fig.subplots_adjust(right=0.7)
for sysid, turbidity in sorted(turbidities.items()):
    turbidity.plot(ax=ax, label='{}: {}'.format(sysid, pnm_solar['name'][sysid]))
ax.legend(loc=1, bbox_to_anchor=(1.35, 1))
ax.set_ylabel('TL')
```
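To put a number on the spread visible in that figure, one could take the daily range across sites. A standalone sketch with a made-up turbidity table standing in for the `turbidities` dict above (the three sites and their values are invented for illustration):

```python
import pandas as pd

# made-up daily TL series for three hypothetical sites
times = pd.date_range('2016-01-01', '2016-12-31', freq='D')
tl = pd.DataFrame({'site_a': 3.0, 'site_b': 3.3, 'site_c': 2.8},
                  index=times)

# site-to-site spread for each day, and the worst case over the year
daily_spread = tl.max(axis=1) - tl.min(axis=1)
worst_case = daily_spread.max()
```

With the real `turbidities` dict, `pd.DataFrame(turbidities)` would give the equivalent table.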
PR #278 provides some functions to calculate turbidity from different aod and pw measurements, so we can close this issue.
Has anybody ever looked at the default turbidity data? I finally got around to it and I think it's horrible.
First, I was surprised when changing my latitude and longitude by just a small amount gave me fairly different results.
The values for Tucson change by up to 0.5 depending on whether I round to the nearest degree, with no significant terrain differences between these points.
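That sensitivity is consistent with a nearest-neighbor lookup into a gridded file. A minimal sketch of why rounding matters, assuming a global grid at 1/12-degree (5 arc-minute) resolution, which I believe matches the bundled turbidity file (the indexing scheme here is an illustrative assumption, not pvlib's actual code):

```python
# assumed layout: 2160 x 4320 global grid, 1/12 degree per cell,
# row 0 at 90N and column 0 at 180W
def grid_index(lat, lon, cells_per_degree=12):
    row = int((90.0 - lat) * cells_per_degree)
    col = int((lon + 180.0) * cells_per_degree)
    return row, col

tucson = (32.2226, -110.9747)
exact = grid_index(*tucson)
rounded = grid_index(round(tucson[0]), round(tucson[1]))
```

Rounding Tucson's latitude to the nearest degree moves the lookup a few grid cells, so any cell-to-cell noise in the underlying data shows up directly in the returned TL.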
Then I decided to just plot the full array for every month. Here's the code plus a couple of images.
Sure, it has the global, seasonal trends right, but there is a ton of crap in there.