Intellectual Phase Locking – Not Just For Climate Science Anymore!

A rather fascinating “Banned TED Talk”: “The Science Delusion” by Rupert Sheldrake.

He lists 10 dogmas of Science and then proceeds to assert that these are not proven, and may well be wrong in some way.

Then a couple are examined in a bit more depth; in particular, the notion that “G” is a constant and that the speed of light does not vary. He cites evidence that in fact they do vary when you look at the raw data, but that the variation is averaged out because “they are constants”… For light, the metrologists involved were quite proud of the fact that in about 1972 they redefined the meter in terms of light, so they were assured the variations could not return!

To understand the reference to Intellectual Phase Locking, listen to the part about variation in the constants… At the start of the talk, he addresses a fundamental nature of Science (as open questioning) that is in conflict with Science As World View i.e. dogma. Shades of the Global Warming intolerance for real scientific questioning of their data, methods, and conclusions!

A thought provoking talk, well worth the time.



About E.M.Smith

A technical managerial sort interested in things from Stonehenge to computer science. My present “hot buttons” are the mythology of Climate Change and ancient metrology; but things change...
This entry was posted in AGW Science and Background, Science Bits. Bookmark the permalink.

30 Responses to Intellectual Phase Locking – Not Just For Climate Science Anymore!

  1. philjourdan says:

    If you question the basic laws (constants), it does open up a whole new line of research and possibilities, but it also then calls into question the framework of science as it exists today.

    It is thought provoking. And it can radically change the questions science is now asking, going back many steps to again start questioning the basics.

  2. E.M.Smith says:

    I find it intriguing that they detected variations in the measured constants and the “answer” was to just average them all… then to redefine the yard stick (literally) for speed of light…

  3. E.M.Smith says:

    Interesting… looks like at least a few folks are asking the question “Is C a constant”…

  4. E.M.Smith says:

    Oh Dear…

    Seems someone has shown that ‘c’ needs to be restated as:

    The speed of light, c, is a constant in a vacuum for a plane wave.

    By making a conical wave, they got c to change a bit…

    For this reason, c is correctly referred to as “the speed of light in a vacuum.” However, in a paper on arXiv, Miles Padgett from the University of Glasgow has shown that even this needs a rethink. He manipulated the wave structure of some photons and sent them on a path of the same length as unaltered packets of light. The manipulated photons arrived later, indicating they were travelling more slowly.

    The manipulation occurred by twisting a plane wave (one where the wave front is a parallel plane at right angles to the direction of travel) into a conical wave front, which is analogous to focusing a wave from a spread-out source onto a single point.
    It’s not the first time that the speed of light in a vacuum has been shown to be flexible. The velocity of light can be reduced within a hollow waveguide, and discrepancies in the time of arrival of light and neutrinos have led to suggestions of unrecognized effects from gravity, albeit on a scale many times smaller than Padgett proposes.

    Professor Robert Boyd of the University of Rochester, New York, told Science News, “I’m not surprised the effect exists. But it’s surprising that the effect is so large and robust.”

    “Our work highlights that, even in free space, the invariance of the speed of light only applies to plane waves,” the authors conclude. It is unclear whether the work has any practical applications. There are no indications to suggest that since light speed can be lowered, it can also be raised or breached for interstellar travel.
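    As a back-of-envelope check on the size of such an effect (my own sketch, not the paper’s actual analysis): if the axial group velocity of a conical wave is taken as c·cos(θ) for cone half-angle θ, the extra arrival delay over a given path is tiny but nonzero:

```python
import math

c = 299_792_458.0          # speed of light in vacuum, m/s

def conical_wave_delay(path_m, cone_angle_rad):
    """Extra arrival delay of a conical (Bessel-like) wave versus a plane
    wave over the same path, using the simple geometric picture in which
    the axial group velocity is c*cos(theta)."""
    v_axial = c * math.cos(cone_angle_rad)
    return path_m / v_axial - path_m / c

# Over a 1 m path with a small cone half-angle, the delay is roughly
# theta^2 / (2c) seconds: measurable, but only just.
delay = conical_wave_delay(1.0, 0.01)
print(f"{delay:.3e} s")
```

    So a femtosecond-scale lag per metre is about what the simple geometry predicts for small angles.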

  5. E.M.Smith says:

    Oh Dear. G varies a lot too. Not only that, it does it in a 5.9 year cycle that matches periodic cycles in the length of day. The “explanation” looks pretty darned ad hoc to me, sort of a ‘hand wave’ that maybe whatever is changing LOD is making it seem like G is changing… why not some real change causing both?

    Why do measurements of the gravitational constant vary so much?
    April 21, 2015, by Lisa Zyga
    Newton’s gravitational constant, G, has been measured about a dozen times over the last 40 years, but the results have varied by much more than would be expected due to random and systematic errors. Now scientists have found that the measured G values oscillate over time like a sine wave with a period of 5.9 years. It’s not G itself that is varying by this much, they propose, but more likely something else is affecting the measurements.

    As a clue to what this “something else” is, the scientists note that the 5.9-year oscillatory period of the measured G values correlates almost perfectly with the 5.9-year oscillatory period of Earth’s rotation rate, as determined by recent Length of Day (LOD) measurements. Although the scientists do not claim to know what causes the G/LOD correlation, they cautiously suggest that the “least unlikely” explanation may involve circulating currents in the Earth’s core. The changing currents may modify Earth’s rotational inertia, affecting LOD, and be accompanied by density variations, affecting G.

    Maybe Jupiter is what done it…

    Possible gravitational link between the 5.9-year period in the length of the day of the Earth and the 11.86-year orbit of Jupiter around the Sun
    C. S. Unnikrishnan ∗
    Gravitation Group, Tata Institute of Fundamental Research, Homi Bhabha Road, Mumbai – 400 005, India

    We note that the recently discovered 5.9 year period in the length of the day of the earth, with amplitude 0.13 ms, matches in period and phase with the earth-Jupiter distance attaining an extremum, at those times when Jupiter is at its perihelion or aphelion and the Sun and the Earth align along its orbital major axis. Though no physical mechanism is evident, the strong correlation at matched phase is suggestive of one that peaks at the period of these conjunctions. The recent confirmation of a periodic variation of the length of the day of the earth (represented as ∆LOD), with period 5.9 years and amplitude of about 0.13 ms [1], has attracted wide interest. Subsequent observation of statistically significant correlation between the ∆LOD data and different measured values of the gravitational constant G, with variation of 100 ppm, has prompted the speculation that there could be a causal link between the two, perhaps through the ∆LOD influencing some aspect of the measurement, and has drawn the attention of the gravitation experiment community to the ∆LOD data [2, 3].

    So either Jupiter is upsetting the measurement or it is upsetting G. Could G vary with MASS? Might that explain some of the issues with galaxies not “rotating right”?
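    A toy illustration of the claimed correlation (purely illustrative numbers, not the actual G or LOD data): two sinusoids sharing the 5.9-year period and phase correlate essentially perfectly, which is all the papers’ statistics are really detecting:

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

PERIOD = 5.9  # years, shared by the G measurements and the LOD data
years = [i * 0.1 for i in range(600)]  # 60 years, sampled every ~5 weeks

# Hypothetical series: measured G anomaly (ppm) and LOD anomaly (ms),
# modelled as in-phase sinusoids with the same 5.9-year period.
g_anomaly   = [100 * math.sin(2 * math.pi * t / PERIOD) for t in years]
lod_anomaly = [0.13 * math.sin(2 * math.pi * t / PERIOD) for t in years]

print(round(pearson(g_anomaly, lod_anomaly), 6))
```

    Of course, a perfect correlation between two same-phase sine waves says nothing about which one (if either) causes the other.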

  6. Gary says:

    Morphic resonance sounds pretty sci-fi, but it’s been noted that with athletic records, once someone breaks a supposed barrier, several people soon follow. Bannister’s conquering of the four-minute mile comes to mind as an example. However, if teleconnection learning is real, then why do so many fail to learn what is widely known already? Or is it only rats in a maze that have the ability?

    As for varying constants, one has to account for the various methods of measurement. Is everybody using the same method and just getting small errors from unknown (unknowable?) sources, or do different methods confirm/contradict the “standard” number in consistent ways?

  7. Interesting heresy…. Back around the turn of the 20th century, Lord Kelvin opined that all the laws of physics had been found and that it was the task of the scientists that followed him to get measurements to more decimal places.

    One thing that should be obvious is that as we get to be able to measure things more accurately, we find the little deviations from the known rules that we hadn’t seen before. Some can be accommodated fairly easily without changing the structure of the laws, but others need a total re-think of the reasons things happen.

    Big G is measured by using some known masses and by seeing how much they attract using a very delicate torsion balance. If you think deeply about it, if gravitational attraction is between all particles of mass (and most likely energy as well) then the shape and size of the test masses relative to the separations used is going to be important. We can’t just say all the mass is effectively at the centre of mass (or centre of gravity) and it all acts in the same straight line from the centre of one mass to the centre of the other. Almost all the particles will be off that line, and thus the attraction will be vectored and the actual net force will be lower than expected. When I’ve got some really free time I’d like to analyse the G experiments and see if they actually do agree with each other when this is taken into account. Seeing the daily measurements and looking for cycles may also be interesting. Averages hide the details, and there may be a lot of information hidden in the lab notes.
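To put a rough number on that “off the line” effect, here is a minimal sketch (my own toy geometry, not any actual G experiment): two thin parallel rods held perpendicular to the line joining their centres attract with less force than point masses at the same separation, because every element pair is farther apart and the sideways components cancel:

```python
def force_between_rods(d, length, n=400):
    """Net attractive force (with G = m1 = m2 = 1) between two thin parallel
    rods of the given length, each oriented perpendicular to the line joining
    their centres, which are a distance d apart. Each rod is discretized
    into n equal point masses."""
    dm = 1.0 / n
    ys = [(-length / 2) + (i + 0.5) * length / n for i in range(n)]
    fx = 0.0
    for y1 in ys:
        for y2 in ys:
            r2 = d * d + (y2 - y1) ** 2
            fx += dm * dm * d / r2 ** 1.5   # x-component only; y cancels
    return fx

# Point-mass prediction at d = 1 (all constants set to 1) would be 1.0;
# the extended rods give noticeably less.
print(force_between_rods(d=1.0, length=2.0))
```

At large separations the rod-rod force converges back to the point-mass value, which is why the effect only matters when the test masses are comparable in size to their separation, as they are in a torsion balance.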

    It is also possible that G does actually vary. Some people have measured small anomalies during eclipses, with the idea that one celestial body (the Moon) can shield the Sun to a measurable extent. Even that it can change the inertial effect of that mass. Since these have been measured using clocks that depend on inertia remaining constant (balance-wheels, pendula and quartz crystals will speed up if inertial mass decreases) there is a problem with actually measuring the effect. It needs a different clock, maybe something like an R-C constant (yep, L-C oscillators may also depend on the inertial mass of the electrons, so can’t trust those either).

    With the metre defined in terms of the speed of light, and the second also, we definitely can’t measure any difference in its speed. If it is actually variable, though, and E=Mc², then Conservation of Energy is actually up for contention.

    Though I feel that “morphic field” may be going a bit too far (some echoes of Discworld in that) I can’t dismiss it as rubbish. We do know that animals have inherited characteristics as to what they actually do as well as what they look like. Where’s the memory in the egg that allows a bird to navigate across the world to its summer home and then back to its winter one? Eels go to the Sargasso sea to spawn, and Salmon swim upriver to the source to spawn. In what form is the map of the world stored and the instructions of when and where to go to? Fish eggs are after all often left to fend for themselves, and there’s no morphic field from a parent around. If you take some frogspawn out of the pond and let it hatch in a jam-jar, you still get frogs with the same characteristic responses, mating calls, etc..

    I think the lesson here is simply to not be so certain that the text-books are correct that you miss evidence where they are wrong.

  8. Larry Ledwick says:

    It has always struck me as interesting that the Sun’s average sunspot cycle length is very close to the period of Jupiter’s orbit around the sun. As Tallbloke’s Talkshop’s various discussions about resonances in orbits suggest, we have a classic wheels-within-wheels problem here.

    If time / space can be warped by gravity, perhaps gravity can also be warped by other influences / forces or circumstances. Maybe it’s all just one huge rubber sheet of forces and effects where all are distorted by the others.

  9. p.g.sharrow says:

    It seems that many “constants” in astrophysics/physics are not. They are based on the assumption that the conditions the assumptions rest on are constant. It has been known by real researchers in these fields that there are variations in the measures that the “constants” are based on. The universal acceptance of constants in the speed of light and the application of gravity has resulted in many errors in understanding in physics/astrophysics. The present “science” of climate is a good display of errors in judgement based on incorrect assumptions and constants.
    The local density of the fabric of space, Aether, causes change in atomic processes, the speed of light, and gravity. All this has been known for a long time, but is not a part of the “standard model”. Because if these things do not change then there is no change in local density, no need for Aether; therefore it doesn’t exist because we don’t need it. But atomic process speed does change, the speed of light does change, and rates of gravity do change. Constants are close but not constant in the real Universe!
    That is why I constructed my own “model” to better understand the functions of mass/inertia and gravity. I do require the existence of Aether to make my universe work.
    The speed of Fusion / Fission events does change with local changes in matter/energy density. The speed of light does change with local changes in density. Even local gravity does.

  10. ossqss says:

    Quite the interesting video EM.

    Also quite interesting that he did it without wearing shoes.

    LOL, what was he trying to convey there?

  11. Keith MacDonald says:

    It looks like the TED board that banned Sheldrake has no idea how correct he is.

    In 1912 Einstein concluded that:
    “Das Prinzip der Konstanz der Lichtgeschwindigkeit kann nur insofern aufrechterhalten werden, als man sich auf für Raum-Zeitliche-Gebiete mit konstantem Gravitationspotential beschränkt.“

    (“The principle of the constancy of the speed of light can be kept only when one restricts oneself to space-time regions of constant gravitational potential.”)

    Max Born stated that both speed and direction of light change in a gravity field.

    Richard Tolman agreed and expressed the radial speed of light as dr/dt in a gravity field.

    My conclusions?
    1) Einstein himself defined the speed of light as a slowly-moving variable
    2) speed of light is only a constant when the gravitational potential doesn’t change. (or)
    3) the speed of light must change if the gravitational potential changes
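    The size of that gravitational effect on the coordinate speed of light can be estimated to first order in the weak-field limit, where the fractional slowing is roughly 2GM/(rc²). A quick sketch with textbook constants (the formula is the standard weak-field approximation, not anything from Einstein’s 1912 paper itself):

```python
# First-order weak-field estimate of how much the coordinate speed of
# light is reduced by a gravitational potential: delta_c/c ~ 2*G*M/(r*c^2).
C        = 299_792_458.0      # m/s
GM_SUN   = 1.32712440018e20   # m^3/s^2, standard gravitational parameter
GM_EARTH = 3.986004418e14
AU       = 1.495978707e11     # m, mean Earth-Sun distance
R_EARTH  = 6.371e6            # m, Earth's mean radius

def fractional_slowing(gm, r):
    return 2.0 * gm / (r * C * C)

sun_here   = fractional_slowing(GM_SUN, AU)       # Sun's potential at Earth
earth_here = fractional_slowing(GM_EARTH, R_EARTH)
print(f"Sun at 1 AU: {sun_here:.2e}, Earth at surface: {earth_here:.2e}")
```

    So even here at Earth’s surface, the Sun’s potential dominates the effect, and both are parts-per-billion at most: real, but far below everyday measurement.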

  12. tom0mason says:

    I have always said that in science there are no real facts (unmovable laws, and constants) — we only have a consensus on what is today’s best estimates that attempt to explain the universe about us. Science is always caught by the limitations of measurements with finite accuracy and precision. Any ‘law’ or ‘constant’ may be updated tomorrow with new observations and new tests. Mathematical statistics rules science, and when correctly applied gives us the best ‘mean’ for any physical parameter within a range of observed variation. Seeking new scientifically/mathematically verifiable methods is what keeps science progressing.

    Science is not a catalogue of known facts but a patchy and incomplete agglomeration of best approximations upon which the majority of scientists can agree.
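    A minimal sketch of that “best mean within observed variation” idea, using made-up measurement values (illustrative only, not real determinations of any constant): the inverse-variance weighted mean is the usual way a consensus value is struck:

```python
import math

# Hypothetical repeated measurements of some physical "constant", each
# with its own quoted uncertainty (the numbers are illustrative only).
values = [6.6742, 6.6735, 6.6755, 6.6740, 6.6749]
sigmas = [0.0007, 0.0010, 0.0012, 0.0005, 0.0009]

# Inverse-variance weighted mean: more certain measurements count more.
weights = [1.0 / s ** 2 for s in sigmas]
mean = sum(w * v for w, v in zip(weights, values)) / sum(weights)
stderr = math.sqrt(1.0 / sum(weights))

print(f"consensus: {mean:.4f} +/- {stderr:.4f}")
```

    Notice the quoted uncertainty of the consensus is smaller than any single measurement’s, which is exactly how a spread of discordant results gets averaged into one tidy “constant”.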

  13. Larry Ledwick says:

    Yes, a good example is the different values of Pi that have been used over the years in different regions and eras, each driven by what was “good enough” precision for the tasks they needed it for.

    The Babylonians started with a value of pi = 3.
    Around 1900–1680 BC they upgraded the precision to a value of 3.125.
    Archimedes (287–212 BC) showed that pi is between 3 1/7 and 3 10/71.
    Zu Chongzhi (429–501), a brilliant Chinese mathematician, independently came up with a value for Pi of 355/113, which in decimal is 3.14159292, very close to the modern value of 3.14159265, only beginning to differ in the 7th decimal place.

  14. E.M.Smith says:

    There were several different fractions used for Pi that were chosen based on the degree of math you were willing to do to get the precision desired.
    22 / 7
    333 / 106
    355 / 113
    52163 / 16604

    You get folks asserting that the ancients didn’t know Pi to many digits when in fact they used different fractions to get different precisions.
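    The precision each fraction buys is easy to tabulate (a quick check, using the fractions listed above):

```python
import math

# Classic rational approximations of pi, with the error of each.
fractions = [(3, 1), (22, 7), (333, 106), (355, 113), (52163, 16604)]

for num, den in fractions:
    approx = num / den
    print(f"{num}/{den} = {approx:.8f}  error = {abs(approx - math.pi):.2e}")
```

    Each extra digit of precision cost the ancients a bigger denominator, i.e. more arithmetic, which is exactly the trade-off described above.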

  15. Larry Ledwick says:

    Like all models, “wrong but useful” fits here too. Even if you could calculate it exactly, you don’t need perfect precision for a constant like Pi to be able to do useful work. For some guy trying to figure out how many hides it will take to make his yurt, a Pi value of 3 and a paced diameter is close enough to get the job done. The precision known and used by a sheep herder trying to figure out how many fence rails are needed to make a circular corral is way different from that of an engineer to the Pharaoh who is planning some huge structure. Even though both lived in the same time period they would likely have different learned values of Pi.

    Like you say, the followers of Pythagoras would have invested in a lot more computational complexity to do what they wanted to do, and their “known value of Pi” would have been much more precise than that used by a carpenter in the small town where they were born.

    For most real world applications used by the average person, a precision to 2 or 3 decimal places is overkill, and for even the most complex cases 5 decimal precision is good enough.

    For rough work the acceleration of gravity can be expressed as just 32 ft/sec² and be good enough to find a useful solution. In engineering we seldom bothered going beyond 32.2 ft/sec² because it was easy to work with, although a more precise value is 32.174 ft/sec².
    Did it really matter if the stone hit the ground at 63 ft/sec or 62.153 ft/sec?

    So what is the acceleration of gravity if you ask me? It would depend on my perception of your need for precision, I might give all 3 values at different times, or pull down a copy of the CRC tables and get the value to over 20 decimal places if I was going to compute accelerations lasting years for a space probe.
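    A quick illustration of how little the extra digits of g matter for rough work (assuming, hypothetically, a 60 ft drop with no air drag):

```python
import math

def impact_speed(height_ft, g):
    """Speed (ft/sec) of an object dropped from rest, ignoring air drag:
    v = sqrt(2*g*h)."""
    return math.sqrt(2.0 * g * height_ft)

# The same hypothetical 60 ft drop computed with three precisions of g.
for g in (32.0, 32.2, 32.174):
    print(f"g = {g:7.3f} ft/sec^2  ->  v = {impact_speed(60.0, g):.3f} ft/sec")
```

    The three answers differ by a fraction of a percent: plenty good enough for a stone, nowhere near good enough for a years-long space probe trajectory.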

  16. Soronel Haetir says:

    Some of his statements are just wrong, such as the mechanistic medicine claim at the end (that is, that governments don’t fund such research). The _vast_ majority of government funded research is into such mechanistic medicine because frankly every time something else is put to a rigorous test it fails. But that doesn’t mean that no such research is performed on a public dime.

    Some of the rest of it then gets very much into quibbling over definitions (I’m thinking particularly of his “matter having no consciousness” formulation). Matter itself could (and I believe does) completely lack consciousness while particular clumps of matter could have it as an emergent phenomenon. The old “chemistry is more than applied physics and biology is more than applied chemistry”.

    I’m sorry but after listening to his list I’m not interested in going further.

  17. There’s a lot of evidence from brain injuries in specific areas that points to the mind being enclosed within the brain and that states of mind can be induced by stimulating specific areas of the brain. Memory is somewhat more complex, but does appear to be specifically in the brain and chemical. I remember an experiment with maze-running rats where the brain of a rat that had learnt a maze was pulverised and injected into another rat that then could run that maze better – sorry, some time ago and I didn’t store a link. There’s also been some speculation that brains are somewhat like quantum computers, and thus far more capable than analysis as a binary computer would suggest. I still find it amazing how much information is stored in a seed or an egg, though.

    The speed of light is dependent upon the gravitational field, but this is built-in to the equations anyway. The speed of light in the intergalactic void will thus be faster than we measure here, but if we were out there measuring it we’d probably get the same answer because our clocks would be running that bit faster there, too. There’s no good reason to assume that the permittivity and permeability of free space are constants over all space and time, and it’s possible that they would depend on the size of the universe at that point in time. That constancy is however an axiom, and when there’s evidence to the contrary then I expect it to be modified as needed. There’ll probably be a large number of people initially who will resist the dumping of that axiom, but old scientists die and new ones take their place who don’t have the same attachment to those axioms and can look at the evidence more clearly. At some point the balance tips and we have a new paradigm.

    pg – whether you call it spacetime (that can be distorted) or Aether, the structure of space does have properties and isn’t just *nothing*. Einstein didn’t remove the idea of Aether, but just re-named it and specified the properties. By specifying that light moves along a geodesic, which is the equivalent of a straight line but compared to Euclidean geometry isn’t straight, he also redefined the geometry of space. This is however based on what we measure to happen based on light-speed transmission of any data and the variability of the speed of light in a gravitational field. I suspect that however the speed of light is not actually dependent upon the measured net gravitational field, and that at the mid-point between two massive bodies (where the measured gravity is zero) it will in fact be reduced in velocity from proximity to both masses – it’s the gravity-field density that matters, not its net force. It may take a while before that can actually be checked. It may however show up in the GPS satellites where the Moon is overhead – net gravity would say that the GPS clock would run slightly faster, whereas the total field hypothesis would say that the GPS clock slows slightly.

    The more precision we get in our measurements, the more we find that things are not quite what we thought they were.

  18. cdquarles says:

    Agreed, Simon. Consider the index of refraction, where we know that light’s interaction with matter (matter and energy being two sides of one actuality) slows it. There was a demonstration of this via media with a very large index of refraction that could be varied. Packets of light were slowed to meters per second velocities, such that you could see the packet’s motion. I’m not sure that one could find it now. I didn’t keep a copy of that demonstration.

    I also saw a report where they did droplet experiments on a vibrating membrane. They did the same kind of slit experiments done a century ago that demonstrated wave-like properties for light. They got similar results, if I am remembering correctly. I did copy that one, but, sadly, it was on an archive drive system that failed. That one, I think, can be found now online.

  19. Soronel Haetir says:

    There are lots of conditions under which light can be slowed compared with c; c is specifically the speed of light when not under any of these conditions. I would only call it news if they could get a demonstrable system showing propagation velocities greater than c.

  20. Chazz says:

    So, before 1972, the speed of light measurement depended on a constant yardstick, now the yardstick length depends on a constant speed of light. I can see where this line of thinking could lead to some problems.

  21. CDQ – your second paragraph seems to correspond to which shows a physical reality of Bohm’s pilot-wave theory. The particle affects the wave, and the wave affects the particle, and the wave also interacts with the environment. To me, this is a more satisfying explanation than wave-functions collapsing when someone measures them, and instead says that things happen whether you watch them or not. I think the slowed/stopped light was done at MIT: .

    Soronel – it seems reasonable that the speed of light we can actually measure on Earth will be slower than the ultimate velocity because there is a significant gravity field. We can see evidence of that in gravity lensing in astronomy. In intergalactic space, therefore, light should be measurably faster than we measure here. At the moment we don’t really have the capability to measure this (since we can’t get out there) but it could have been inferred if we’d had the right kit on something like Voyager. Quite simple to do, too, since we just need an LC oscillator where the capacitor is two plates separated by the vacuum of space and a coil that is coreless so depends on the permeability of the vacuum. Measure the frequency of oscillation, which will be proportional to the speed of light. Maybe such an experiment would find nothing surprising, but it could tell us how the speed of light varies outside the local system.

AFAIK it’s now possible to measure the rate difference between clocks when one is as little as 10m above the other. It’s an interesting way of measuring gravity by its effect on time. That vacuumed LC oscillator could possibly also be used as a gravimeter, though it may be a little difficult to achieve the same precision as atomic clocks. Still, if it could be made precise enough it could also be used to see if the speed of light is affected by the net gravitational field or the total scalar sum of the counteracting gravitational fields, which would be a useful measurement to make. As you go down a mine, the net gravitational field reduces, so we could check if the rate of time speeds up (because net field reduces) or remains constant (since scalar sum of fields is the same).
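That 10 m clock-rate difference follows from the standard weak-field time dilation estimate g·h/c², which lands right at the one-part-in-1e15 level that modern optical clocks can resolve (a textbook approximation, sketched here):

```python
C = 299_792_458.0   # m/s
G_SURFACE = 9.81    # m/s^2, approximate gravitational acceleration

def fractional_rate_shift(height_m, g=G_SURFACE):
    """Weak-field gravitational time dilation: a clock raised by height h
    runs faster by a fraction g*h/c^2 relative to one left below."""
    return g * height_m / C ** 2

# A 10 m height difference shifts clock rates by about one part in 1e15.
print(f"{fractional_rate_shift(10.0):.3e}")
```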

  22. E.M.Smith says:


    Very interesting, that oil drop… so Quantum Mechanics Magic can be replaced with pilot wave ripples in the aether….

    Speed of light being faster in free space: we could define it as a nice round number like 300,000 instead of 299…. and let the meter adjust :-0 But might this be why spacecraft seem to have the wrong speed as they get far from Earth, and why galaxies’ spin is “wrong”? C and G vary w/ distance to galactic core…

    When c and G might be variables and time fluid, what are the constants in your LC oscillator and what variables are being measured….

  23. p.g.sharrow says:

    You guys are reminding me that I really need to do an electrostatic experiment that I have been considering, to create artificial gravity. Need to make a well insulated plate and set up my great HV Tesla coil outside to charge it. The pincushion and plasma conduit at its center should act as an HV Diode. I also created 2 – 40,000v semi-conductor diode strings but I’m not sure they are up to this.
    The operation of a capacitor warps the dielectric between its two plates. This warpage is the same as that caused by gravity and the warpage caused in electrostatic linear accelerators. Other experiments published have demonstrated accelerations under electrostatic charging.

    Warpage of the dielectric is the push or pull of the atomic nucleus from its position at the center of the atom, or the movement of the center of mass from the center of being. This sets up a stress to acceleration in an attempt to center itself within the electron shell that is its surface…pg

  24. cdquarles says:

    Thanks, Simon. Those do seem to be what I was thinking of.

    @pg … hmm, sounds like a scaled up electron-atomic force microscope. I have done electron microscopy, but that force microscope was demonstrated well after my pathology lab days. I’d say you should be successful in making a working prototype.

  25. LG says:

    Harold “Sonny” White, head of an advanced propulsion lab called Eagleworks at Johnson Space Center in Houston, has been hard at work toward DARPA’s goal of warp capability within 100 years.

    Paper :

    This paper will begin with a short review of the Alcubierre warp drive metric and describes how the phenomenon might work based on the original paper. The canonical form of the metric was developed and published in [6] which provided key insight into the field potential and boost for the field which remedied a critical paradox in the original Alcubierre concept of operations. A modified concept of operations based on the canonical form of the metric that remedies the paradox is presented and discussed. The idea of a warp drive in higher dimensional space-time (manifold) will then be briefly considered by comparing the null-like geodesics of the Alcubierre metric to the Chung-Freese metric to illustrate the mathematical role of hyperspace coordinates. The net effect of using a warp drive “technology” coupled with conventional propulsion systems on an exploration mission will be discussed using the nomenclature of early mission planning. Finally, an overview of the warp field interferometer test bed being implemented in the Advanced Propulsion Physics Laboratory: Eagleworks (APPL:E) at the Johnson Space Center will be detailed. While warp field mechanics has not had a “Chicago Pile” moment, the tools necessary to detect a modest instance of the phenomenon are near at hand.

    Warp field mechanics 101. Available from: [accessed May 03 2018].

    Alcubierre’s concept had been considered infeasible because it required far more power than any viable energy source could produce. White re-calculated the Alcubierre concept and proposed that if the warp bubble around a spacecraft were shaped like a torus, it would be much more energy efficient and make the concept feasible. White has stated that “warp travel” has not yet seen a “Chicago Pile-1” experiment, a reference to the very first nuclear reactor, the breakthrough demonstration that paved the way for nuclear power.[4][5][6]

    To investigate the feasibility of a warp drive, White and his team have designed a warp field interferometer test bed to demonstrate warp field phenomena. The experiments are taking place at NASA’s Advanced Propulsion Physics Laboratory (“Eagleworks”) at the Johnson Space Center.[5] White and his team claim that this modified Michelson interferometer will detect distortion of space-time, a warp field effect.[7]

    In 2010, NASA physicist Harold White revealed that he and a team were working on a design for this faster-than-light ship, and this is the most recent design of what such a ship might actually look like. As you can see in the image, the ship rests between two enormous rings, which create the warp bubble.

    Artist Mark Rademaker worked on the project with White. In the release, Rademaker asserts that he spent over 1,600 hours working on the design. The ship is called the IXS Enterprise, and it is meant to fit the concept for a Faster Than Light ship. Mike Okuda also brought input, and designed the Ship’s insignia.

  26. EM – that oil-drop experiment shows that the non-intuitive parts of quantum theory can be explained in an intuitive way, providing you specify spacetime with properties a bit more Aether-like. Still, there are quite a number of different explanations of quantum theory now that mostly give the same answers but use different mechanisms. Wiki has a list of around 14 IIRC.

    How far do you need to be away from a gravitational mass to be in “free space”? Given that gravity itself is an indication that the space isn’t free, and gravity has infinite reach (theoretically, anyway), then free space, like vacuum itself, isn’t achievable. It seems stars can have mutual orbits up in the light-year distance range, so that really gives us some yardstick to set where we regard it as near-enough free space. Voyager isn’t far enough away yet, therefore, and still a bit subject to the Sun.

    For galaxy spin being “wrong”, check Mike McCulloch’s blog at . His explanation not only sounds crazy enough to be true, but also predicts a lot of other things we hadn’t considered connected, such as fly-by anomalies, the EMDrive, and the recent measurements from some very early galaxies. It also removes the need for “dark matter” and “dark energy” which have stubbornly resisted efforts to measure them and probably don’t exist anyway. If you can’t ever measure it, it doesn’t exist in scientific terms…. I’m sure I’ve mentioned Mike here before, but it’s worth another plug like the Bohm oil-drops.

    The LC oscillator effectively measures the speed of light, given that the speed of light is 1/root(EoUo). Since the resonant frequency is 1/(2.pi.root(LC)), and the capacitance C depends on the permittivity Eo and the inductance L on the permeability Uo, the frequency will precisely follow the speed of light in those conditions. I’d use Greek letters, but they’re not easy on the standard keyboard and may display wrongly on other people’s computers anyway. Though you couldn’t use the LC oscillator to measure the absolute speed of light, it’s fine for measuring changes in the speed of light. It will need to be kept at a constant temperature to stop drifts from thermal expansion, and will need to be as far as possible from the spacecraft mass, but otherwise it’s a good comparison.
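    A quick numeric sketch of the scaling described above (Python; the constants are nominal SI values and the 1 uH / 1 nF component values are purely illustrative). Since C ∝ Eo and L ∝ Uo, a hypothetical shift in both permittivity and permeability shows up directly in the resonant frequency:

    ```python
    import math

    # Nominal SI values for vacuum permittivity and permeability
    EPS0 = 8.8541878128e-12  # F/m
    MU0 = 1.25663706212e-6   # H/m

    def speed_of_light(eps, mu):
        """c = 1/sqrt(eps * mu)."""
        return 1.0 / math.sqrt(eps * mu)

    def lc_resonant_freq(L, C):
        """f = 1/(2*pi*sqrt(L*C)) for an ideal LC tank."""
        return 1.0 / (2.0 * math.pi * math.sqrt(L * C))

    c0 = speed_of_light(EPS0, MU0)   # ≈ 2.9979e8 m/s

    # Illustrative tank: 1 uH, 1 nF gives f0 ≈ 5.03 MHz
    L, C = 1e-6, 1e-9
    f0 = lc_resonant_freq(L, C)

    # If Eo and Uo each rose 0.1% (hypothetical), c would fall ~0.1%,
    # and so would the resonant frequency, since C ∝ Eo and L ∝ Uo.
    f_shifted = lc_resonant_freq(L * 1.001, C * 1.001)
    print(f_shifted / f0)  # ≈ 0.999 — frequency tracks c
    ```

    The ratio comes out as 1/1.001, i.e. the fractional change in frequency equals the fractional change in c, which is why the comparison works even though the absolute value of c can’t be recovered this way.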

    pg – at those sorts of voltages the air breakdown is going to be a problem. Best do it when the air is as dry as it gets. I’m not sure that the distortion of the atoms will produce a net force, though. You’ll need to show that by experiment. It may also be useful to set up a cheap laser pointer to shine through the air-gap on the capacitor to see if there’s any change of direction of the light (are you actually warping space?). The main problem I see in getting a reliable result is that at these voltages there will be forces to ground and to the building you’re in, so sorting out what is what may be tricky. In free air there’s also normally a voltage-gradient of around 300V/m so there will in any case be forces on a charged body which will vary with the weather. You can’t tell the difference between a produced force and a variance in gravitational attraction. For measuring movements, don’t forget the method of using a laser reflected from a moving mirror. Automatically gives twice the angle difference on the spot, and the “needle” can be as long as you want. All stuff you know, but maybe a reminder may be helpful.

  27. p.g.sharrow says:

    @Simon; I’m considering a single well-insulated plate charged relative to ground, then looking for changes in mass/weight, both above and below. I should think that any useful effects must be easy to see or they would not be useful. If results were obvious then they would not be caused by an undetermined fluke. Although an unanticipated fluke might be the best explanation of any results. ;-)
    I think I’ll need at least 1/4 inch of polypropylene with welded joints to enclose the charged plate/disk. Got to consider the wiring as well. Off the top of my head, after about 40,000 volts everything gets dicey. This will have to be done outside. Good thing this is California and the rainy season is over…pg

  28. pg – maybe a good idea to go for an insulator with a higher relative permittivity (PE and PP have a relative permittivity of around 2) and have enough thickness to get the dielectric strength high enough. You thus might do better using sheets of glass (relative permittivity around 5-6) or resin-bound glassfibre (around 4-5). The effective thickness of the insulator (relative to a layer of air) is reduced by the ratio of the relative permittivity, so the PP at 1/4″ would be equivalent to 1/8″ of air when you look at the surface field strength, but glass at 1/4″ would be equivalent to only around 1/20″ (and you can probably use thinner glass). Maybe use epoxy to stick the sheets together with a wide margin at the edge.
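    The equivalent-air-thickness figures above can be checked with a couple of lines (Python; the permittivity values are the typical ballpark numbers quoted in the comment, not measured ones):

    ```python
    # Equivalent "air thickness" of a dielectric layer for surface-field
    # purposes: t_eff = t / relative_permittivity.
    def equivalent_air_thickness(thickness_in, rel_permittivity):
        return thickness_in / rel_permittivity

    materials = {
        "polypropylene": 2.2,       # typical relative permittivity
        "glass": 5.5,
        "glassfibre/epoxy": 4.5,
    }

    for name, er in materials.items():
        t_eff = equivalent_air_thickness(0.25, er)  # 1/4 inch sheet
        print(f"{name}: 0.25 in acts like {t_eff:.3f} in of air")
    ```

    Polypropylene at 1/4″ comes out near 1/8″ of air and glass near 1/20″, matching the figures above: the higher-permittivity material concentrates the field drop in the air gap rather than in itself.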

    Since all you’re looking at here is a static voltage, you don’t really need much power. All the power drives is the leakage current, and that’s going to be very low. Rather than using a Tesla coil to charge it, maybe better to use a Van de Graaff generator or similar, which won’t need rectification, and you can get up to megavolts pretty easily.

  29. Chris in Calgary says:

    I’m reminded of Isaac Newton’s quote:
    To myself I seem to have been only like a boy playing on the sea-shore, and diverting myself in now and then finding a smoother pebble or a prettier shell than ordinary, whilst the great ocean of truth lay all undiscovered before me.

    It’s just as true today — maybe we now have a couple of toes in the water.

    Having said that, the constants don’t vary much over time. If E=mc^2 holds, then a rise in c would cause stars to increase in temperature very quickly, and to explode if the rise was big enough. Similarly, a fall in c could cause certain stars to collapse in on themselves and the rest to shrink and cool. c^2 is a big number, and adjusting it up or down significantly would have huge effects.
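    The sensitivity argument is easy to make numeric: since rest energy scales as c^2, a fractional change in c produces roughly twice that fractional change in energy. A sketch (Python; the solar mass value is the standard nominal figure, and the 1% shift is purely hypothetical):

    ```python
    C = 2.99792458e8  # m/s

    def rest_energy(mass_kg, c=C):
        """E = m * c^2."""
        return mass_kg * c**2

    m_sun = 1.989e30  # kg, nominal solar mass
    e0 = rest_energy(m_sun)

    # A hypothetical 1% rise in c raises rest energy by 1.01^2 - 1 ≈ 2.01%
    e1 = rest_energy(m_sun, C * 1.01)
    print((e1 - e0) / e0)  # ≈ 0.0201
    ```

    So even a 1% drift in c would correspond to a ~2% change in the energy budget of every star, which is why the observed stability of stars is itself a constraint on how much c can have varied.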

  30. catweazle666 says:

    “I also created 2 – 40,000v semi-conductor diode strings but I’m not sure they are up to this.”

    Have you considered a Cockcroft-Walton voltage multiplier?
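    For scale, the standard unloaded formula for a Cockcroft-Walton multiplier is V_out ≈ 2 · n · V_peak for n stages driven by a sine wave; real stacks sag considerably under load and with stray capacitance. A sketch of what it would take to pass pg’s 40 kV figure from mains-level drive (Python; the 240 V RMS drive is an assumed example):

    ```python
    import math

    def cw_output_voltage(n_stages, v_rms):
        """Unloaded output of an n-stage Cockcroft-Walton multiplier
        driven by a sine of the given RMS voltage: 2 * n * V_peak."""
        v_peak = v_rms * math.sqrt(2.0)
        return 2.0 * n_stages * v_peak

    # Stages needed to exceed 40 kV from 240 V RMS drive:
    target = 40_000.0
    n = 1
    while cw_output_voltage(n, 240.0) < target:
        n += 1
    print(n)  # 59 stages (ideal diodes and capacitors, no load)
    ```

    In practice the stack is driven at a few kV from a transformer so only a handful of stages are needed, and each stage’s diodes and capacitors only have to stand off roughly twice the peak drive voltage, which is the multiplier’s main attraction here.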

Comments are closed.