3

The picture below shows the design of a half-wavelength (λ/2) dipole antenna. I have read explanations saying this is the best design for optimum transmit power. The half wavelength is measured from tip to tip across the two elements (you know it).

But my concern here is: what is the minimum or maximum distance of the gap shown in the picture, i.e. the gap between the two elements where the feed line is connected? Some say it should be as small as possible, and although that makes sense, I don't have a mathematical justification for it. I expect the explanation is in terms of λ. But if you really need the frequency used, take it as 2,100 MHz. If needed, the feed line is RG6 75 ohm coax.

[Image: half-λ dipole antenna]

Sitorus

4 Answers

2

The gap is negligible in terms of λ. Say you are making a 20 m (14 MHz) dipole and you decide to use a large gap, say 10 cm. That is only 0.005λ. On the 2 m band (144 MHz) it is 0.05λ. If you choose an even larger gap, the wires that connect the arms of the dipole to the feed line will simply act as parts of the arms. Once again: there is no significant gap beyond the spacing between the center conductor of the coax and its shield.
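A quick sanity check of those fractions (a minimal Python sketch; the 10 cm and 2 mm gap sizes are just illustrative values, the second using the asker's 2,100 MHz):

```python
C = 299_792_458  # speed of light in vacuum, m/s

def gap_in_wavelengths(gap_m: float, freq_hz: float) -> float:
    """Express a feedpoint gap as a fraction of one wavelength."""
    wavelength_m = C / freq_hz
    return gap_m / wavelength_m

# A generous 10 cm gap on the 20 m band (14 MHz):
print(f"{gap_in_wavelengths(0.10, 14e6):.4f} wavelengths")    # ~0.005
# A 2 mm gap at 2,100 MHz, where the whole dipole is only ~7 cm:
print(f"{gap_in_wavelengths(0.002, 2100e6):.4f} wavelengths")
```

Note that at 2,100 MHz even a 2 mm gap is already a few percent of a wavelength, so "as small as practical" matters more at microwave frequencies than on the HF bands.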

In other words, just connect the arms to the coax in whatever way is convenient, then trim the length of the arms for minimum SWR at the center of the band.

[Richard Fry is the author of the following paragraph and graphic.] Below is a NEC4.2 study of a 40 m, half-wavelength, center-fed, free-space dipole with a 0.2 m gap for an insulator at the feedpoint. NEC sources themselves are applied at a single point on a conductor, but this approach, or a variation of it, might lead to a reasonable, practical solution for most amateur radio operators and applications.

[Image: NEC4.2 model results for the 40 m dipole with a 0.2 m feedpoint gap]

Richard Fry
1

Looking at this sentence,

But unfortunately, I don't have mathematical justification for that reason

Here is a mathematical answer:

The minimum gap is zero and the maximum gap is $\lambda/2$. The mathematically ideal gap is 0, as explained in several books on antenna theory: the radiated field involves an integral of the current over the dipole's length, and any gap removes radiating conductor from that integral.
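To make the "integral" remark concrete, here is the standard thin-wire result it alludes to (textbook antenna theory, not a claim from the original answer): the far field of a center-fed, half-wave dipole is proportional to an integral of the current distribution $I(z')$ along the conductor,

$$E_\theta \propto \sin\theta \int_{-\lambda/4}^{\lambda/4} I(z')\, e^{jkz'\cos\theta}\, dz' ,$$

so a gap at the center removes a slice of current-carrying conductor from the integration range, and the center is exactly where $I(z')$ is largest on a half-wave dipole.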

Any gap between 0 and $\lambda/2$ will work, but it will work a little "better" the closer the gap is to 0.

However, with the way you are laying out your dipole, a zero gap would short-circuit the feedpoint and the dipole would not function. An infinitesimally small gap would let sparks jump across it, which is also bad. So add a gap to prevent a short circuit, and make it just big enough to prevent sparking; mathematically, you want the gap as small as possible to maximize the aforementioned integral.

If you accidentally make it too small, an easy fix is to fill the gap with an insulator, or simply to make the gap bigger.

It really is that simple.
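As a rough illustration of the sparking concern above (a hedged sketch, not part of the original answer): the peak RF voltage across a matched resistive feedpoint is $\sqrt{2PR}$, and dry air breaks down at roughly 3 kV/mm. The 100 W power level and the textbook 73 Ω dipole feedpoint resistance below are assumed, not measured values, and no safety margin is included.

```python
import math

AIR_BREAKDOWN_V_PER_M = 3e6  # rough dielectric strength of dry air

def peak_feed_voltage(power_w: float, feed_r_ohms: float) -> float:
    """Peak RF voltage across a matched, purely resistive feedpoint."""
    return math.sqrt(2 * power_w * feed_r_ohms)

def min_safe_gap_m(power_w: float, feed_r_ohms: float = 73.0) -> float:
    """Very rough minimum air gap to avoid arcing (no safety margin)."""
    return peak_feed_voltage(power_w, feed_r_ohms) / AIR_BREAKDOWN_V_PER_M

print(f"{peak_feed_voltage(100, 73):.0f} V peak")    # ~121 V at 100 W
print(f"{min_safe_gap_m(100) * 1e6:.0f} micrometres")
```

Even at 100 W the no-margin figure is only tens of micrometres, which is why any practical insulator gap is electrically safe while still being negligible in terms of λ.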

Chris K8NVH
0

Actually, that gap is fixed within the cable: it is the distance between the inner conductor and the shield.

The part of the cable from the point where the inner conductor exits the shield is actually part of the antenna.

So the exact distance between the antenna wires does not matter.

However, a minimum distance is important: if the wires are too close, they might cause sparking. That depends on the transmitter power.

[Image: dipole antenna attachment to the cable]

Pedja YT9TP
0

The point where the feedline separates marks the center of the dipole. That length of conductor matters: it radiates, carries current, and affects the behavior of the antenna.

If the conductors feeding the "dipole" wires don't diverge at 90 degrees and are "significantly long", the behavior gets harder to calculate by hand. The NEC software many hams use can calculate the impact and let you model variations.

cmm