
By the numbers: Lifetime Performance of World’s First Offshore Wind Farm

Windfarm statistics and performance figures

Watts Up With That?

Decommissioning of the world’s first offshore wind farm offers an opportunity to see how industry costs have changed over the past 25 years.

Guest essay by T. A. “Ike” Kiefer

The first offshore wind farm in the world has just been decommissioned and is now being torn down ( http://www.windpoweroffshore.com/article/1427436/dong-begins-vindeby-decommissioning-pictures ). Its lifetime performance specs are illuminating in comparison with recent wind-industry data and alternative generation options.

Decommissioning has started at the 26-year-old Vindeby offshore project, one of the world’s first. The 4.95 MW Vindeby offshore project was installed in 1991 using 11 Bonus 450 kW turbines. It operated 1.5-3.0 km off the southern Danish coast.

1991 Vindeby Offshore Wind Farm – Denmark

Years of Operation: 1991-2016 (25)

Capital Cost: 75M Kroner = $13M (1991USD) = $23M (2017USD)

Number of Turbines: 11 @ 450 kW

Lifetime Generation: 243 GWh

Nameplate Capacity: 4.95 MW

Average Power Output:…
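A quick back-of-the-envelope check on the figures listed above (my own arithmetic, not part of the original post; I assume an average year of 8766 hours):

```python
# Figures taken from the Vindeby list above
lifetime_gwh = 243.0   # lifetime generation
years = 25             # years of operation
nameplate_mw = 4.95    # 11 turbines x 450 kW

hours = years * 8766                      # 8766 h per average year (365.25 days)
avg_mw = lifetime_gwh * 1000.0 / hours    # average power output in MW
cf = avg_mw / nameplate_mw                # capacity factor

print(f"Average power: {avg_mw:.2f} MW")  # ~1.11 MW
print(f"Capacity factor: {cf:.1%}")       # ~22.4%
```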


Update, November 7, 2016: Extension to Lesson Two, Reflection Coefficients

Today I have added my computation of the effective medium for a particular causal wavelet having a 24 Hz dominant frequency. I have posted a graph of the impedance of the medium as measured by the well logs and of the effective medium computed by Backus averaging. I have plotted the resulting impedance graphs (well logs) for the original log data and for the Backus-averaged data. I have also plotted the reflection coefficients computed from the finely sampled log measurements and from the finely sampled but Backus-smoothed effective-medium logs, and I have provided the conventional well logs for comparison. I think the case is well on its way to being convincingly made that reflection coefficients are a function of the dominant frequency of the wavelet. Soon I will provide plots of the superposition of the causal wavelets on top of the reflection spikes. I personally think these examples bring into question the use of “broad band” seismic for detailed mapping. This argument will be bolstered later by the discussion of Fresnel zones, which are also frequency dependent. I will provide more examples of higher-frequency wavelets later.
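The kind of computation described above can be sketched in a few lines. This is my own illustration on a made-up blocky log, not the posted data; the averaging window length stands in for the dominant-wavelength dependence:

```python
import numpy as np

def reflection_coefficients(z):
    """Normal-incidence reflection coefficients between successive samples."""
    return (z[1:] - z[:-1]) / (z[1:] + z[:-1])

def backus_average(rho, vp, window):
    """Acoustic Backus average over a running window:
    density averaged arithmetically, the P-wave modulus harmonically."""
    m = rho * vp**2                                   # P-wave modulus
    kern = np.ones(window) / window
    rho_eff = np.convolve(rho, kern, mode="same")
    m_eff = 1.0 / np.convolve(1.0 / m, kern, mode="same")
    vp_eff = np.sqrt(m_eff / rho_eff)
    return rho_eff, vp_eff

# Toy blocky "log" of alternating thin layers (made-up values).
# The effective medium smooths the fine layering, so the reflection
# coefficients shrink as the averaging window (lower dominant
# frequency) grows.
idx = np.arange(200)
rho = np.where(idx % 20 < 10, 2.2, 2.5)       # g/cc
vp  = np.where(idx % 20 < 10, 2800.0, 3400.0)  # m/s

rc_fine = reflection_coefficients(rho * vp)
rho_b, vp_b = backus_average(rho, vp, window=15)
rc_backus = reflection_coefficients(rho_b * vp_b)

print(rc_fine.max(), rc_backus.max())  # Backus-smoothed RCs are smaller
```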

I would be interested in your comments and discussion on this idea.

Where the Blog is headed

Imaging these kinds of dips is very sensitive to velocity. I have been working on the concept of the reflection coefficient and have added a lot of material. I am trying to demonstrate that it is frequency dependent. This is due to the differences in the Backus-averaged effective medium at different dominant frequencies, which in turn arise primarily from the differences in the width of the principal lobe of the wavelet at different dominant frequencies.

I’ve posted a long exposition on velocities: the many different kinds, their uses, and their sources. This is perhaps the most important post of all, because velocity is at the center of our work as geophysicists and allows us to talk to the geologists.

I’ve explored some of the consequences of using the wrong velocity to migrate seismic data and demonstrated that steep seismic dip is extremely sensitive to errors in the velocity model. In the lesson on aliasing I’ve given an example of a grossly under-migrated salt dome flank. It is from the mid-1990s, when we really didn’t know what we were doing; but when you consider the extreme sensitivity of imaging on salt flanks dipping 60 or 70 degrees or more, the result should come as no surprise.
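The sensitivity of steep dips to velocity error can be sketched with the standard migrated-dip relation, sin θ = v·p/2, where p is the measured time-dip of a reflector on two-way time data (my own illustration, with made-up numbers):

```python
import numpy as np

def migrated_dip_deg(p_s_per_m, v_mps):
    """Migrated dip angle from time-dip p via sin(theta) = v*p/2.
    Returns None when the argument exceeds 1 (the dip cannot be imaged)."""
    s = v_mps * p_s_per_m / 2.0
    return float(np.degrees(np.arcsin(s))) if s <= 1.0 else None

v_true = 3000.0                                 # m/s (assumed)
p = 2.0 * np.sin(np.radians(65.0)) / v_true     # time-dip of a 65-degree flank

print(migrated_dip_deg(p, v_true))              # 65 degrees with correct velocity
print(migrated_dip_deg(p, 1.05 * v_true))       # ~72 degrees: a 5% velocity error
print(migrated_dip_deg(p, 1.15 * v_true))       # None: the flank vanishes entirely
```

At gentle dips the same 5% velocity error moves the dip by only a degree or two; near 60-70 degrees it distorts the image badly or destroys it.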

I have examined the accuracy of the notion that a picked seismic reflection precisely follows the well tops.

I will then move to one of the basic concepts necessary for time-to-depth conversion. I am a great believer in the Faust method of time-to-depth conversion of time maps where there is well control, and I have developed the basic equations for this and given some examples.
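The Faust relation makes velocity grow as a fractional power of depth; a common simplified form is V(z) = a·z^(1/6), with the constant a calibrated at a well tie. A hedged sketch of how that turns a time map into a depth map (my own illustration, not the blog’s equations; the well values are made up):

```python
def faust_depth(t_oneway_s, a):
    """Depth from one-way time under V(z) = a * z**(1/6).
    Integrating dt = dz/V gives t(z) = (6/5) * z**(5/6) / a,
    so z(t) = (5*a*t/6)**(6/5)."""
    return (5.0 * a * t_oneway_s / 6.0) ** 1.2

def calibrate_a(z_well_m, t_oneway_s):
    """Solve the same relation for the calibration constant a at a well,
    where both the horizon depth (well top) and its time are known."""
    return 6.0 * z_well_m ** (5.0 / 6.0) / (5.0 * t_oneway_s)

# Example: a well top at 2500 m picked at 1.0 s one-way time
a = calibrate_a(2500.0, 1.0)
print(faust_depth(1.0, a))   # reproduces 2500 m at the well
print(faust_depth(0.8, a))   # depth of the same horizon away from the well
```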

I’m going to develop the equations for NMO and demonstrate that the standard method of correction is only approximately correct. The consequence of the residual errors in NMO is loss of high frequencies in the stack. The same happens in migration using straight-ray equations, and there are consequences for depth migration as well. In general, one of my “soapbox” arguments is that seismic data processing continually attenuates the high frequencies unless draconian measures are taken.
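As a minimal sketch of the hyperbolic moveout equation and the stretch that costs high frequencies after correction (my own illustration, with made-up values):

```python
import numpy as np

def t_nmo(t0, x, v):
    """Two-way traveltime on a hyperbolic moveout curve:
    t(x) = sqrt(t0**2 + (x/v)**2)."""
    return np.sqrt(t0**2 + (x / v) ** 2)

t0, v = 1.0, 2000.0                       # s, m/s (assumed values)
offsets = np.array([0.0, 1000.0, 2000.0])  # m

t = t_nmo(t0, offsets, v)
# Classic NMO-stretch measure: the fractional frequency distortion
# is approximately delta_t_NMO / t0 = (t(x) - t0) / t0.
stretch = t / t0 - 1.0

print(t)        # moveout times
print(stretch)  # grows with offset: far traces are stretched the most
```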

I’m going to proceed from there to the study of how thin a bed can be. I hope to demonstrate that the usual concepts of the resolvability of zero-phase wavelets are significantly modified when the wavelet is minimum phase.
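For reference, the usual zero-phase benchmark is the quarter-wavelength (Rayleigh) tuning thickness; the lesson’s point is that a minimum-phase wavelet changes this picture. A one-line sketch of the benchmark (my own example values):

```python
def tuning_thickness(v_mps, f_dom_hz):
    """Quarter-wavelength tuning thickness: lambda/4 = v / (4 * f_dominant)."""
    return v_mps / (4.0 * f_dom_hz)

# Example: 3000 m/s interval velocity, 30 Hz dominant frequency
print(tuning_thickness(3000.0, 30.0))  # 25.0 m
```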



I am working slowly and resolutely, and God willing, I will eventually fulfill all of these goals and others. Please be patient with me.