By definition, the efficiency of an AM transmitter is the AC power delivered to the load divided by the DC power into the final amplifier, times 100%. The FCC rules limit the DC power input to 100 mW. Since that input is a fixed number, gains can only be had by increasing the efficiency of the amplifier. We could get up to 100 mW of RF power out if we could achieve 100% efficiency. For a transmitter with an efficiency of 30%, over two thirds of the input power is wasted as heat; if we could get the efficiency up to around 70%, we would more than double the power to the antenna. 70% is not unrealistic for a Class C amplifier.
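The arithmetic above can be sketched in a few lines. The 100 mW DC input limit and the 30% and 70% efficiency figures come from the discussion; everything else here is just the definition of efficiency worked forward.

```python
# Efficiency = RF power out / DC power in * 100%.
# With the DC input fixed at 100 mW, power to the antenna scales
# directly with amplifier efficiency.

P_DC_IN = 0.100  # watts, DC input to the final amplifier (FCC limit)

def rf_power_out(efficiency):
    """RF power delivered to the load for a given amplifier efficiency (0..1)."""
    return P_DC_IN * efficiency

p30 = rf_power_out(0.30)  # 30% efficient amplifier
p70 = rf_power_out(0.70)  # 70% efficient (realistic for Class C)

print(f"30% eff: {p30*1000:.0f} mW out, {(P_DC_IN - p30)*1000:.0f} mW wasted as heat")
print(f"70% eff: {p70*1000:.0f} mW out, {(P_DC_IN - p70)*1000:.0f} mW wasted as heat")
print(f"Improvement: {p70/p30:.2f}x the power to the antenna")
```

Going from 30% to 70% gives 70 mW instead of 30 mW to the antenna, a factor of about 2.3, which is the "more than double" figure above.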
I do not have any specific mods for any transmitter, since I have not given this much thought, but I can give some general information. As I or others think about this, we can post more information later.
For the test load I used a 28 ohm 2W resistor soldered to the center and shield of a 6 inch length of 50 ohm coax cable with a BNC connector on the end. The coax is too short to cause SWR problems at this frequency. I used 28 ohms because it is what I had in stock and because this is close to the predicted resistance of a properly tuned 3 m antenna with a loading coil and buried ground radials. The resistance and coax length are similar to what I would expect in a situation where the transmitter is mounted at the base of the antenna.
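For anyone measuring power into a dummy load like the one described, the usual approach is to read the peak-to-peak RF voltage across the resistor with a scope and convert to power. This is a hypothetical helper, not a procedure from the post; the 28 ohm value is the load resistor above, and it assumes a clean sine wave.

```python
import math

R_LOAD = 28.0  # ohms, the dummy-load resistor described above

def power_from_vpp(v_pp, r=R_LOAD):
    """RF power into the load from a scope's peak-to-peak voltage reading.

    Assumes a sinusoidal waveform: Vrms = Vpp / (2 * sqrt(2)),
    then P = Vrms^2 / R.
    """
    v_rms = v_pp / (2.0 * math.sqrt(2.0))
    return v_rms ** 2 / r

# Example: 4.0 V peak-to-peak across 28 ohms
print(f"{power_from_vpp(4.0) * 1000:.1f} mW")
```

If the waveform is visibly distorted (as it can be out of a Class C stage before the output filter), the sine-wave assumption breaks down and this will over- or under-read; measure after the low-pass filter.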
The efficiency is not directly related to the antenna coupling, within limits; the power out and the power in both are. If by changing the match you increase the power out, the power in will also increase. Since efficiency is the quotient of these two powers, it will not change much, just a bit, due to a possibly nonlinear relationship between power out and power in. I am not saying that a proper match is unimportant; it is a factor in efficiency and is key to delivering the power to the load, but there are other things affecting efficiency.
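A toy illustration of the quotient point above: if a better match raises both the output power and the DC draw roughly together, the ratio barely moves. The numbers here are made up purely for illustration, not measurements from the post.

```python
def efficiency(p_out, p_in):
    """Amplifier efficiency in percent: RF power out / DC power in * 100."""
    return p_out / p_in * 100.0

# Before retuning the match (hypothetical figures):
before = efficiency(0.030, 0.100)   # 30 mW out on 100 mW in

# After retuning: more power reaches the load, but the final
# amplifier also draws more DC, so the ratio changes only slightly.
after = efficiency(0.036, 0.115)    # 36 mW out on 115 mW in

print(f"before: {before:.1f}%, after: {after:.1f}%")
```

The output power went up 20% but the efficiency moved only about a point, which is the distinction being drawn: matching delivers power to the load, while efficiency is mostly set by the amplifier's operating class and losses.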
After I ponder this for a bit, I will post more.