So I have an unreasonable problem. I need to create a peak detector that can measure voltages between 0.1 V and ~200 V. I can’t think of any way (or system) that can do that. The signal is AC with a frequency of around 2 kHz, and as the voltage goes up, so does the frequency. I expect the voltage to ramp up from 0 to its final level within 3 seconds.
@ Bec a Fuel - Wow, that’s hardcore voltage measurement. Good to know, but I don’t think it has the 2 kHz response time I’m looking for.
I have an idea. A voltage comparator’s output indicates which of its inputs is higher, swinging between levels set by its supply voltage. Since no comparator I can find accepts 200 V, I should be able to use a voltage divider (a pot) to scale the voltage down into the comparator’s input range. I can then build my peak detector on the comparator’s output. Does this make sense?
One typical, simple-minded way to detect peaks in an analog signal is with an analog differentiator (Differentiator - Wikipedia) and a comparator. The differentiator’s output goes to zero when the signal passes through a peak, and the comparator is set to trigger when the differentiator output approaches zero. The problem is that calculating a derivative, either numerically or with an analog circuit, is inherently noisy, so you have to do a lot of filtering, which will depend on your signal and noise environment details. There are a variety of other clever analog circuits for detecting peaks; if you google “analog peak detector”, you’ll find dozens (http://m.eet.com/media/1131484/13370-70705di.pdf). You’re right that most ICs won’t handle 200V straight in, but you should be able to use a voltage divider to scale the signal appropriately.
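The differentiator-plus-comparator idea translates directly to software: look for the point where the (smoothed) derivative crosses from positive to negative. A minimal sketch in Python, assuming the samples have already been digitized and lightly filtered:

```python
import math

def find_peaks(samples):
    """Return indices where the first difference changes sign from
    + to -, i.e. where the signal passes through a local maximum."""
    peaks = []
    for i in range(1, len(samples) - 1):
        d_prev = samples[i] - samples[i - 1]   # slope going into sample i
        d_next = samples[i + 1] - samples[i]   # slope coming out of sample i
        if d_prev > 0 and d_next <= 0:         # slope flips + to -: a peak
            peaks.append(i)
    return peaks

# One noiseless cycle of a sine sampled 20 times; the maximum is at sample 5.
wave = [math.sin(2 * math.pi * n / 20) for n in range(20)]
print(find_peaks(wave))  # → [5]
```

On real, noisy samples you would low-pass filter first (or compare samples several positions apart, as suggested later in the thread) rather than using adjacent samples.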
On the other hand, your dynamic range is only 2000:1 (200 divided by 0.1), or about 66 dB. Most 16-bit ADCs achieve 80 dB or better dynamic range without much effort. 24-bit ADCs are even better and are also readily available. You should be able to just use a resistor voltage divider to get the 200V signal scaled down to the maximum voltage your ADC can handle. For example, if you pick an ADC with a 5 VDC maximum input level, you’ll need a 40:1 divider and your minimum signal will be 2.5 mV, well within the range of modern ADCs.
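The dynamic-range arithmetic above is worth spelling out. A quick check in Python, using the example values from the post (5 V ADC full scale, 40:1 divider):

```python
import math

v_max, v_min = 200.0, 0.1
ratio = v_max / v_min                     # 2000:1 dynamic range
dyn_range_db = 20 * math.log10(ratio)     # ~66 dB

adc_full_scale = 5.0                      # assumed ADC input limit, volts
divider = v_max / adc_full_scale          # 40:1 resistor divider
v_min_scaled = v_min / divider            # smallest signal at the ADC input

print(round(ratio), round(dyn_range_db, 1), divider,
      round(v_min_scaled * 1000, 2))
# → 2000 66.0 40.0 2.5
```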
To find peaks accurately you have to take a lot of samples per cycle: at least 10, and 100 isn’t unreasonable. If your input signal is 2 kHz, you’ll need an ADC that can take 20,000 - 200,000 samples per second. I’ve been using ADCs from Linear Technology and Analog Devices recently with good success. TI is also a good source. Almost all of their ADC chips have fully configured evaluation or demo boards for less than $100 that will probably do your job just fine. If you go to either of their ADC web pages, you can search by parameter. Look for ADCs with at least 16 bits of resolution that can take 50,000 to 200,000 samples per second. You probably want to stay away from audio converters, and if the ADC you want is described as a sigma-delta converter, you have to make sure the delay between when the signal shows up at the ADC input and when the output has settled is appropriate for your application. Once the signal is digitized, you can filter and differentiate in code, which is a whole lot easier.
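The sample-rate requirement is just signal frequency times samples per cycle; a quick sanity check, using the 2 kHz figure from the original post:

```python
f_signal = 2_000  # Hz, from the original post
for samples_per_cycle in (10, 100):
    rate = f_signal * samples_per_cycle
    print(f"{samples_per_cycle} samples/cycle -> {rate:,} samples/s")
# → 10 samples/cycle -> 20,000 samples/s
# → 100 samples/cycle -> 200,000 samples/s
```

Note the original post also says the frequency rises with voltage, so the top of that range should really be sized against the highest expected frequency, not the nominal 2 kHz.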
@ Gene - I concur on the voltage divider aspect; that makes sense. I realize that digital signal processing would be the all-round way to go. It’s inexpensive and easily adjustable. My only concern was whether it would be fast enough for the application. I guess I’ll have to do it. STM32F405s are DSPs at heart, right? I have 24-bit sigma-delta converters that operate at 96 kHz and use the I2C bus. Yes, the delay time is an issue (it’s the whole reason for considering analogue in the first place). This is good stuff. I’ve got a lot of datasheet research to do.
Timing could certainly be an issue. The simplest calculation I can think of, assuming you use an analog prefilter to get rid of most of the noise, is to take the latest sample, subtract from it a sample a few samples back, and if the sign of the difference changes, that’s a peak. So one subtraction and a couple of comparisons. For example, assume 50 kSamples/second and a 16-bit converter, with I2C at 400 kHz. That would take 40 microseconds just to clock the data in, let alone any setup time for the transfer. If you need to make a decision within 1/4 of a cycle, that’s 125 microseconds, so you’d have 85 microseconds left to make the calculation. I’m guessing .NET Micro wouldn’t support that in C#. Maybe someone who knows about RLP could comment on whether it would be doable that way. Or you could use a SPI ADC - plenty of those around - and you’d save most of the 40 microseconds on the transfer.
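The timing budget above can be sketched out explicitly. The numbers (16 data bits, a 400 kHz I2C clock, a 2 kHz signal) are the assumptions from the post, and the transfer figure counts only data bits, ignoring I2C addressing and ACK overhead:

```python
bits_per_sample = 16
i2c_clock_hz = 400_000
transfer_us = bits_per_sample / i2c_clock_hz * 1e6   # time to clock the data in

signal_hz = 2_000
quarter_cycle_us = 1 / signal_hz / 4 * 1e6           # decision window: 1/4 cycle

compute_budget_us = quarter_cycle_us - transfer_us   # time left for the math
print(round(transfer_us, 1), round(quarter_cycle_us, 1),
      round(compute_budget_us, 1))
# → 40.0 125.0 85.0
```

Real I2C transactions add a start condition, address byte, and ACK bits, so the actual budget is tighter than 85 µs; that strengthens the case for the SPI ADC mentioned above.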
I’m starting to line up ICs for the task: dual comparators, 555 timers, and a latch IC. I’m thinking of using Dave Jones’ example below; however, he stops just short of getting the peak signal output to somewhere I can use it. So I’m using a third op amp to drive the circuit that follows it.