Found a new useless electronics project to waste time on in 2022. Sinewave leveling to 0.3 dB flatness. Likely won't be successful - $10,000 oscilloscope calibrators exist for a reason. Even verifying its flatness is a challenge. But hopefully I'll end up knowing more than when I started...

The variable RF attenuator turned out to be a more difficult problem. Ideally it needs to work down to DC (50 kHz is often used as a reference signal so you can calibrate the calibrator accurately), but it also has to support 24+ dBm of input power. There is almost no suitable IC for that. Perhaps the solution is to solve the problem in reverse - not to attenuate a strong signal but to amplify a weak signal. Then distortion becomes the new problem...

It turns out it's really valuable for this application to target a signal generator with an AM input that supports DC, so you only need to implement a power detector and an error amplifier; amplitude can be controlled at the source. Otherwise, you need to do some stupid signal conditioning with your own variable gain amplifiers and attenuators... At that point you have basically already created a convoluted version of a signal generator's entire output stage, and you may as well put a DDS synthesizer into it and turn it into a complete signal generator...

When you see the circuit you are working on in a paper hosted at LIGO.org, you know you are messing with something you're not qualified to do...

The high power is not just a problem for attenuation, but also for RF amplifiers. The only type of high-power, 1 GHz monolithic RF amplifier I found that covers both RF and DC is the distributed microwave amplifier, which is extremely expensive (they can work at 6, 10, or even 20 GHz) and definitely overkill. The output also requires a user-supplied bias-T - true DC operation is still not practical.

A two-path design looks sensible now. Ideally, if the power meter is at the output side, the frequency response of the entire input chain should not matter.

I just realized I miscalculated the power requirements due to Vp and Vpp confusion :blobfacepalm: . The actual power level I need is only 18 dBm, not 24 dBm. More margin, and it even brings the power within the absolute maximum of the variable gain attenuator candidate. But it's still not enough to solve the attenuator P1dB issue in a "leveling down" design; I still need amplifiers to "level up".
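
For the record, the arithmetic behind the facepalm, as a quick Python sanity check (the 5 Vpp swing into 50 ohms is my assumption - it's consistent with the 18 dBm figure):

import math

Vpp = 5.0                          # peak-to-peak voltage
Vp = Vpp / 2                       # peak voltage - the value that actually matters
P = Vp**2 / (2 * 50)               # average sine power into 50 ohms
print(10 * math.log10(P / 1e-3))   # ~18.0 dBm; plugging in Vpp by mistake gives ~24 dBm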

IDT/Renesas has a high-power RF variable attenuator covering 1 MHz to a few GHz. A two-path design looks ideal here: the attenuator does 95% of the job, removing the need for an expensive and error-prone RF gain block, and a simple opamp handles the rest in the DC-LF range. The only nuisance is that two independent Automatic Level Control loops are needed.

Hmmm, modern RF power meters are quite accurate. The measurement uncertainty is around 0.05 dB between +20 and +25 degrees C. Renting a calibrated meter for a few weeks could be the test plan, but I need to be really careful... Blowing the meter up is equivalent to burning $2000.

For these kinds of measurement devices, what people are really paying for is not the hardware, not the software, not the technology or perhaps not even the service. It's all about a single piece of paper called a NIST-traceable calibration certificate...

The legendary LT1088 thermal RMS-to-DC converter chip was something like 1% accurate to 50 MHz and 2% to 100 MHz - that is 0.2 dB, assuming an ideal match. Even today it would still be a really accurate RF power meter. If I could get an LT1088 and show that it agrees with my leveled sinewave generator to 1%, that would be the end of the story, at least to 100 MHz.

Too bad this chip never sold well and Linear discontinued it decades ago.

Nope. The IDT/Renesas attenuator's 1 dB compression point is also too low below 50 MHz. I still have to go back to a two-path LF/RF design...

Block diagram of the proposed leveled sinewave generator. It's basically just the ALC/AGC loop found in all signal generators and radios, but it must be accurate to within a fraction of a decibel. The two-resistor splitter looks like a joke, but it's really how an RF ALC loop works.

The log amp actually requires some calibration to reach the accuracy needed in my leveled sinewave generator design. And what do you need to perform that calibration in the first place? A leveled sinewave generator...

I think a VNA can solve this problem. The beauty of a Vector Network Analyzer is that you don't need a perfect signal generator or signal path - it measures and removes all the errors by itself via a two-port calibration.

Finally decided the architecture of the RF gain stage. After much flip-flopping between a one-path and a two-path design, I ultimately found that no MMIC simultaneously satisfies the requirements of high output power (5.5 Vpp, or 19 dBm - this alone means the majority of MMICs, the standard 5 V parts, are simply not an option) and DC-to-GHz operation (50 MHz and below is not covered by otherwise suitable MMICs). A separate low/high frequency path is inevitable.

Oh no, the PHA-202+ MMIC produces 4.5 watts of heat from a tiny 6 mm x 5 mm QFN package. The circuit board is gonna be a frying pan, and the heat would also change the operating point of the RF power detector. This is not going well...

This layout sucks. But the MMIC (U8) gets extremely hot in operation, and the sensitive detector needs to stay as far away from it as possible. I'm not even sure if that's enough. I need to get one of those MLX90640 IR imaging sensors...

Evil RF circuit board hack - when you can't get the ideal impedance for the given stackup, just remove the copper in layer 2 and force the RF signal to use layer 3 as the reference plane.

First logarithmic converter RF power meter prototype, to be used in the feedback loop of the leveled sinewave generator. Performance is underwhelming: the uncalibrated frequency response shows up to 17% variation.

But I cannot be sure whether it's really coming from the RFIC - my radio analyzer itself is out of calibration. Time to get an expensive RF power meter to find out... 17% sounds like a lot, but it's just 0.7 dB. Not really bad, but not good enough for me.
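
Quick unit check, since mixing percent-of-power, percent-of-voltage and dB gets confusing fast (plain Python, nothing project-specific):

import math

var = 0.17                        # 17% variation in measured power
print(10 * math.log10(1 + var))   # ~0.68 dB when treated as a power ratio
print(20 * math.log10(1 + var))   # ~1.36 dB if it had been a voltage ratio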

Ah, I think I've found a cheap alternative to the RF power meter to solve this problem - crystal detectors. HP used to sell RF diodes with guaranteed specs, packaged as coax plug-ins for simple RF power control or demodulation applications. Unlike a purpose-designed power sensor, these bare-metal diodes don't come with any accuracy guarantees or calibration. But absolute accuracy doesn't matter here; I only need relative flatness, which is just 0.25 dB for the HP 423A/B. That's great for my purpose.

Last time I evaluated my ADI log converter board, I found its frequency response variation was up to 17%. But I couldn't be sure whether it was really coming from the RFIC, since my radio analyzer itself is out of calibration.

The HP^H^H Agilent diode detector speaks the truth: it measured a similar response. The variation is up to 60%, but that's in power, not voltage. Taking the square root gives 26% - close enough. My PCB and the ADI chip have been acquitted on all counts...

More tests today with the help of a new RF power amplifier. The performance looks ugly - to be expected, since a PA naturally has much worse output matching and distortion than a lab signal synthesizer... Unlike the smooth variation in the old test, the measured response now has strange jumps everywhere, and it's hard to draw any conclusion. I think I really need to pay the money and get a thermistor-based RF power sensor; a heating resistor can never be fooled. But surprisingly, the worst-case variation is still 17%, not too bad.

The first revision of my RF detector board was a total disaster: return loss at 1 GHz was only 10 dB. The coax-to-microstrip transitions went horribly wrong. Using the new knowledge gained from the experimental data, the new revision flips all connectors to the bottom ground side and uses aggressive ground plane cutouts. Hopefully this one will work better.

New boards received and tested. After my """optimizations""", the performance is now even worse! :blobfacepalm: Return loss is still 10 dB at 1 GHz, but the 20 dB point has degraded from 500 MHz to 300 MHz. All the tweak did was make the transition more inductive at low frequency, while doing nothing to stop the excess capacitance at 1 GHz, even with the ground plane almost completely removed.

Now I have to wonder if using a BNC connector at 1 GHz is inherently a bad idea (even if it's meant to be plugged into a BNC system). The cheap connector's manufacturer claims a 3 GHz range and 1.3 VSWR max, but I'm not sure how credible that is... Instead of wasting more time tweaking this transition, perhaps a better idea is to use a proper microwave connector, permanently slap a BNC adapter on it, and call it a day...

scikit-rf has built-in routines for converting a frequency domain response measured on a VNA to the time domain response you'd see on a TDR, using an inverse Fourier transform, nice. :blobcatsip:
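
A minimal sketch of how that looks (assuming a one-port VNA measurement saved as transition.s1p - the file name is made up):

import skrf
import matplotlib.pyplot as plt

# Load the measured S-parameters (hypothetical file)
ntwk = skrf.Network('transition.s1p')

# The IFFT needs data down to DC, so extrapolate the low end first
ntwk_dc = ntwk.extrapolate_to_dc(kind='linear')

# Plot the step-response impedance vs. time, i.e. a synthetic TDR trace
ntwk_dc.s11.plot_z_time_step(window='hamming')
plt.show()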

How to kludge your way from 300 MHz to 1 GHz with copper tape frequency compensation. Quick and dirty, but it works!

First milestone of this project. An early test suggests my power measuring circuitry has probably reached the 0.3 dB flatness design goal from 50 MHz to 1 GHz. Flatness below 200 MHz is even better, no more than 0.1 dB.

This design still has some serious problems. The leveled output is based on average RF power, but oscilloscope calibration uses peak voltage. In an ideal world that's okay, but in reality even a small distortion in the sinewave, like -40 dBc, creates a voltage error of up to 1%. Not to mention the challenge of achieving that figure in a 26 dBm power amplifier from DC to 1 GHz... Typically you get no better than -20 dBc, which means a 10% error, and that is completely unacceptable.
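
The back-of-the-envelope version: a single harmonic N dB below the carrier can shift the waveform peak by up to its full relative amplitude, depending on its phase. A tiny check in Python:

def worst_case_peak_error(dbc):
    # Worst-case relative peak-voltage error from one harmonic at `dbc` relative to the carrier
    return 10 ** (dbc / 20)

print(worst_case_peak_error(-40))   # 0.01 -> up to 1% peak error
print(worst_case_peak_error(-20))   # 0.10 -> up to 10% peak error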

An alternative solution is abolishing the TrueRMS logarithmic converter altogether and using an envelope detector (peak detector) instead, just like how it was done in all the Tektronix calibrators. This would relax the distortion requirements, but it means everything must be retested. The test procedure is a huge headache as well - it's easy to find calibrated RF test gear for TrueRMS / average power measurements, but peak-responding instruments are not as common...

Or perhaps I can determine the crest factor of the waveform and blindly apply a simple correction in software? This cure is not really better than the disease; it raises even more questions about its accuracy...

Time to rerun the test with the new board using a proper microwave power meter... Then I found the GPIB port on the ancient HP 436A RF meter is incompatible... Mechanically.

A screw just happens to be at the right place, blocking the metal shield of my cheap and possibly out-of-spec GPIB cable, and that screw is too rusty to remove. :blobcatknife:

My temporary workaround is using its analog interface instead and digitizing it on the oscilloscope. A quite convoluted setup - at this point almost everything in this room is plugged into the computer. But at least it's fun to stare at the trace jumping up and down on the scope like a cat... :blobcatsip:

The RF power in my auto test setup is basically controlled with this quick and dirty "algorithm"...

while True:
    actual = power_meter.read()   # hypothetical helper: re-read the RF power meter
    delta = target - actual       # error in dB
    if delta < 0.1:               # note: no abs() here...
        break
    rfgen.output += delta         # naive proportional step with a loop gain of 1

Funny to see it demonstrate all kinds of classic, textbook-level control loop instability problems. It bounces up and down before eventually converging to the target, the overshoot is up to 1.5 dB, and it crashes when the commanded level goes above the output limit of the signal generator. Sometimes it can also enter an infinite loop, jumping up and down endlessly around the correct target. :blobcatgiggle:
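
For reference, the textbook fixes would look something like this (same hypothetical rfgen / power_meter objects as above; OUTPUT_MAX is made up):

GAIN = 0.5                                # loop gain < 1 damps the overshoot
for _ in range(50):                       # iteration cap instead of while True
    delta = target - power_meter.read()
    if abs(delta) < 0.1:                  # abs() stops the endless jumping around the target
        break
    step = max(-3.0, min(3.0, GAIN * delta))             # limit the step size
    rfgen.output = min(rfgen.output + step, OUTPUT_MAX)  # clamp to the generator's limit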

New test results for the power detector of my leveled sinewave generator, using a proper microwave power meter - the data looks much better now. The circuit still needs improvements, but the amplitude flatness is now a reasonable [+2.5%, -4%] around the 50 MHz reference.

The previous test, which showed nearly 10% variation, was more of a test of the RF detector in my out-of-calibration spectrum analyzer than a test of my actual circuit! :blobcatgiggle:

For increased measurement confidence, it's a good idea to compare the readings from the HP 8484A microwave power sensor against the HP^H^H Agilent 33330B diode RF detector.

The two sensors closely match, to ~1% from 100 MHz to 1 GHz. Great - it means both sensors are working and neither is damaged. But below 100 MHz, the diode detector shows a serious anomaly. It's probably not a fault: at low frequency, the built-in bypass capacitance inside the detector is too small to filter out the strong RF ripple, which causes erroneous readings.

I found that the RF power sampled by the oscilloscope at the "recorder" output does not visually match the meter's own seven-segment display.

I cross-checked the output with a voltmeter. The voltmeter and the power meter's display agree to 0.01 dB. Meanwhile, the oscilloscope is off by 0.1 dB, an order of magnitude worse.

Who would win?

* A 47-year-old discrete precision ADC built with individual transistors, opamps and comparators in a 1975 meter.

* A typical 8-bit high-speed ADC in a 2010s oscilloscope.

Of course, a generic high-speed ADC design can never compete with a DC precision one, even if one is from the 2010s and the other from 1975... :blobcatgiggle:

By the official specs, the oscilloscope as a tool just can't be trusted for precision measurements at either DC or RF. The DC accuracy is only guaranteed to be 3% or so, for good reasons - these high-speed 8-bit ADCs are not accurate. But at RF it's much worse: neither accuracy nor flatness is guaranteed at all (besides that lonely 3 dB point)... Vendors don't even suggest a "typical" frequency response. From experience we know the typical performance can be pretty good, but it's rarely guaranteed.

At this point, I seriously suspect the vendors are just exploiting this historical convention to lower their standards. "If not guaranteeing anything is the industry norm, why should we make our jobs harder?"

The only exception is vendors with a background in computer automation and digitizers. National Instruments specifies the frequency response for all their products - yeah, if all your customers are working with raw signals, they will ask for a performance guarantee...

That being said, I have to say I'm seriously surprised by the actual frequency response of my 100 MHz oscilloscope. At 50 MHz, the RF power meter and the scope agree within 0.44% in amplitude... Just WTF?! This SHOULD NOT happen...

The power meter and the scope probably have compensating errors... Both are wrong in the same way, giving a totally unrealistic apparent accuracy...

Oh no, my clueless attempt to build a sinewave calibrator has escalated to a level of advanced metrology I totally don't understand. The only "tutorials" I can find are all from the national metrology institute of Switzerland...

For every RF calibration problem I have in this project, I can find answers in METAS' conference talk tutorials. Switzerland's national lab looks pretty strong at RF metrology.

This looks like a cool dual resistor to use in a power meter calibration fixture: 0.05% ratio tolerance. But I wonder how much it costs... It's not in the Vishay public catalog and the sales blurb mentions "aerospace", so I guess a single one costs $100 and takes a 6-month lead time because they're all custom made... :blobcatgiggle:


Directional coupler arrived. Time to try this output return loss measurement technique from Switzerland and see whether it works. This Mini-Circuits coupler is also older than me, from 1991...


First quick test. Successfully observed ±0.2 dB incident power ripple due to constructive and destructive interference under impedance mismatch, just like the textbook says. As a rough estimate, Γl = 0.333 gives a calculated Γg = 0.071, i.e. the signal generator's output return loss is around 23 dB. The next step is to redo the experiment seriously with full error correction.
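
Sanity-checking those numbers with the usual two-reflection ripple model, where the peak-to-peak ripple in dB is 20·log10((1 + |ΓgΓl|) / (1 - |ΓgΓl|)) - plain Python, values from the post above:

import math

ripple_pp_db = 0.4                          # observed +/-0.2 dB ripple
gamma_l = 0.333                             # known load reflection coefficient
r = 10 ** (ripple_pp_db / 20)
m = (r - 1) / (r + 1)                       # |Γg·Γl| recovered from the ripple
gamma_g = m / gamma_l
print(gamma_g, -20 * math.log10(gamma_g))   # ~0.07 and ~23 dB return loss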

The experiment has finally succeeded. After failing to understand the math and messing with it for a week, I have replicated the "coupler + power measurement + solving equations" technique from METAS, Switzerland, for determining a signal generator's equivalent source mismatch.

Using an attenuator attached to the signal generator for verification, the data obtained with this method shows strong correlation and reasonable agreement (aside from a few outliers) with direct VNA measurements.

Another thing I realized while doing this experiment: if your script needs to process data from three or more CSV files, don't do that - import them into SQLite first.

My script was getting unmanageable, with tons of loops over arrays just to extract a data point at a given frequency. If only there were something that handled the data automatically... It's called a database.
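
Roughly what that refactor looks like (a minimal sketch; the file, table and column names are all made up):

import csv, sqlite3

con = sqlite3.connect('measurements.db')
con.execute('CREATE TABLE IF NOT EXISTS readings (source TEXT, freq REAL, value REAL)')

for name in ('vna.csv', 'power_meter.csv', 'detector.csv'):   # hypothetical input files
    with open(name, newline='') as f:
        rows = [(name, float(r['freq']), float(r['value'])) for r in csv.DictReader(f)]
    con.executemany('INSERT INTO readings VALUES (?, ?, ?)', rows)
con.commit()

# One query replaces all those nested loops over arrays:
for row in con.execute('SELECT source, value FROM readings WHERE freq = ?', (50e6,)):
    print(row)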

Fixed one data anomaly: the VNA data was invalid because this signal generator uses a reflective switch to disconnect the output in RF-off mode, causing an extreme mismatch. The VNA trace must be measured with RF on (being careful not to blow up the VNA...). Now the ripple in the VNA trace is gone.
