Interview with Astronomer Tom Van Flandern

Posted by archive 

source: [www.megasociety.net]

Christopher Michael Langan: Recent, still-accumulating evidence involving type Ia supernovae seems to indicate that the universe is expanding at an accelerating rate. Does this fit into your “Meta Model”?



Tom Van Flandern: If one uses the Big Bang redshift-distance law, then the data implies that the “velocity” producing the redshift is “accelerating”.



But if one uses the Meta Model redshift-distance law (DM2, p. 95), then the energy loss is simply linear with distance. No acceleration is required.



The illusion of acceleration arises in the Big Bang’s redshift-distance law, which implies that the whole universe out to infinite redshift lies within a finite distance. In the Meta Model, redshift is linear with distance to infinity.
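The contrast can be sketched numerically. The snippet below uses an Einstein–de Sitter comoving distance as a simple stand-in for “the Big Bang redshift-distance law” and a linear relation z = H0·d/c as a stand-in for the Meta Model law; both stand-ins and the value of H0 are illustrative assumptions, not formulas taken from DM2.

```python
import numpy as np

C_KM_S = 299_792.458           # speed of light, km/s
H0 = 70.0                      # illustrative Hubble constant, km/s/Mpc
HUBBLE_LENGTH = C_KM_S / H0    # c/H0 in Mpc

def d_big_bang(z):
    """Comoving distance in a matter-only (Einstein-de Sitter) universe, used
    here as a simple stand-in for 'the Big Bang redshift-distance law'.
    It saturates at 2*c/H0 as z -> infinity."""
    return 2.0 * HUBBLE_LENGTH * (1.0 - 1.0 / np.sqrt(1.0 + z))

def d_linear(z):
    """Illustrative linear redshift-distance law, z = H0*d/c, standing in for
    the Meta Model relation described in the text."""
    return HUBBLE_LENGTH * z

for z in (0.5, 1.0, 5.0, 100.0, 1e6):
    print(f"z = {z:>9g}:  BB-like d = {d_big_bang(z):10.1f} Mpc,"
          f"  linear d = {d_linear(z):14.1f} Mpc")
```

Under the first law the distance never exceeds 2c/H0 (about 8,600 Mpc with these numbers) no matter how large z becomes; under the second it grows without limit.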



CML: Immanuel Velikovsky (“Worlds in Collision”), who propounded certain ideas similar to your own regarding the origin of the asteroids, has been widely reviled as a “pseudoscientist” and is despised by many planetologists and scientists in general. Have you been subjected to pejorative remarks on the basis of this resemblance? Would you resent evenhanded comparisons with Velikovsky, given your own research on the exploding planet hypothesis? What is your opinion regarding Velikovsky’s notorious “Venus is a comet expelled by Jupiter” hypothesis?



TVF: The late Bill Kaula used to teach his students at UCLA about the errors in the ways of the “four Vs”: Velikovsky, von Daniken, Vatsushevski, and Van Flandern.



When I first read Velikovsky’s work, I had two main reactions: (1) The man’s writing was sensible and logical, by contrast with his reputation, which ranged from incredibly naïve and ignorant to mad. (2) The man’s errors were no more grievous than one finds in any issue of the Astrophysical Journal. I blame my colleagues for suppressing freedom of expression by boycotting Velikovsky’s book, rather than the author for making provocative speculations, some of which turned out wrong.



Over 20 years ago, I attended a meeting of dynamical astronomers in which Yeomans presented results of a numerical integration of Comet Halley back through history. He noted that the integration was forced to end about 1450 B.C. because the comet passed so close to Earth that it was impossible to be sure which side it passed on, and therefore impossible to take the gravitational perturbations back further. In the question period following his talk, I asked Yeomans if that close passage of the comet, certain to have been a spectacular event at the time, might have been the origin of some of the legends that Velikovsky was translating. You could have heard a pin drop. The embarrassed speaker ducked the question and moved on. A vacuum formed around me in the room.



In short, my view is that Velikovsky’s literal scenario involving Venus has been “proved” impossible (in the mathematical sense, using numerical integrations with variational equations), assuming we know about all relevant solar system masses (and probably even if we don’t). However, the writings he struggled to translate probably do record real celestial and terrestrial events. I applaud his pioneering efforts to understand their meanings.



CML: You make a very interesting and seldom-heard point: General Relativity is not locked into the formalism of differential geometry, the mathematical language in terms of which it was originally formulated. You cite at least one possible alternative: "Eddington (1920, p.109) was already aware of the mostly equivalent 'refracting medium' explanation for GR features, which retains Euclidean space and time in the same mathematical formalism. In essence, the bending of light, gravitational redshift, Mercury perihelion advance, and radar time delay can all be (viewed as) consequences of electromagnetic wave motion through an underlying refracting medium that is made denser in proportion to the nearness of a source of gravity. (Van Flandern, 1993, pp. 62-67 and Van Flandern, 1994)" However, the relativistic conception of spacetime, based on a repudiation of the luminiferous ether, seems to deny the existence of any such medium. Are you aware of any models supporting this approach?



TVF: Yes, I am aware of several attempts at models of GR phenomena in flat space-time. See, for example, Gen.Rel. & Grav. 2 #4, 347-357 (1971) by Fernando de Felice, “On the gravitational field acting as an optical medium”. He points out that Einstein himself first suggested the idea that gravitation is equivalent to an optical medium. From the abstract, “… Maxwell’s equations may be written as if they were valid in a flat space-time in which there is an optical medium … this medium turns out to be equivalent to the gravitational field. … we find that the language of classical optics for the ‘equivalent medium’ is as suitable as that of Riemannian geometry.” Nine earlier authors who have studied this similarity are cited in the paper.
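That equivalence is easy to check numerically for the light-bending case. The sketch below is an illustration under the commonly quoted effective index n(r) = 1 + 2GM/(c²r), not code from de Felice's paper or from Meta Research: it integrates the transverse gradient of n along a straight ray grazing the Sun and recovers the familiar deflection of about 1.75 arcseconds, matching the analytic 4GM/(c²b).

```python
import numpy as np

G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8            # speed of light, m/s
M_SUN = 1.989e30       # solar mass, kg
R_SUN = 6.957e8        # solar radius, m (impact parameter of a grazing ray)

def deflection_refractive(b, m=M_SUN, z_max=1e12, n_pts=2_000_001):
    """Bending angle (radians) for a ray with impact parameter b, treating
    gravity as a refracting medium with index n(r) = 1 + 2GM/(c^2 r) and
    integrating |dn/db| along the unperturbed straight path."""
    z = np.linspace(-z_max, z_max, n_pts)
    r = np.hypot(b, z)
    integrand = 2.0 * G * m * b / (C**2 * r**3)   # transverse gradient of n
    dz = z[1] - z[0]
    # trapezoidal rule on the uniform grid
    return dz * (integrand.sum() - 0.5 * (integrand[0] + integrand[-1]))

RAD_TO_ARCSEC = np.degrees(1.0) * 3600.0
alpha = deflection_refractive(R_SUN)
print(f"refracting-medium integral: {alpha * RAD_TO_ARCSEC:.3f} arcsec")
print(f"analytic 4GM/(c^2 b):       "
      f"{4 * G * M_SUN / (C**2 * R_SUN) * RAD_TO_ARCSEC:.3f} arcsec")
```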



Because I found this equivalence of interest, I also have developed a model that meets the same criteria, although from a quite different starting point. DM2 describes the cosmology (the “Meta Model”) that serves as a context for this new model of gravitation, and begins the gravity model development. Some of this gravity model also appears in “Possible New Properties of Gravity” at [metaresearch.org], “Cosmology” tab, “Gravity” sub-tab. The model is carried further in two papers in the Meta Research Bulletin which concentrate on the Mercury perihelion advance prediction. And a new paper containing the most complete account of this model yet will appear as a chapter in a new, multi-author book due out within a few months.



CML: We’ve already ordered “DM2” (Dark Matter, Missing Planets and New Comets) with the highest expectations. Does the new book have a title yet?



TVF: Pushing Gravity: New Perspectives on Le Sage’s Theory of Gravitation, M.R. Edwards, ed., Apeiron, Montreal (2002).



CML: As you know, you're one of the few qualified, highly credentialed scientists willing to give a fair hearing to theories that would be considered offbeat or at least off-center by the mainstream. This implies that you must be contacted by many theorists unable to obtain what they consider a fair hearing for their ideas from the customary sources. How often do these ideas turn out to be worthwhile? Do worthwhile ideas ever come from non-academics?



TVF: The surmise is correct. I get more calls, mail, and email than I can manage. It has forced me to develop “quick look” criteria for evaluating the probability that a particular contribution is worth spending any further time on. Indeed, 25 years ago, I had to take a hard look at my own work and ask myself “How dare I be so arrogant as to oppose the overwhelming opinion of world experts?” My multi-year search for answers forced me to re-discover the rules of the Scientific Method. (The testing part is trivial. The real essence of the method is controls against bias, lest all data and experiments be interpreted in ad hoc ways as consistent with prior beliefs. Mainstream science seems to have forgotten that part, and now has developed vested interests in not re-discovering it.) I also learned the huge importance of communication skills – for the obvious purposes of getting feedback from colleagues and teaching those interested, but also as a form of self-examination about how well one really understands the fundamentals of a topic.



“How often ideas turn out to be worthwhile” is difficult to quantify, but I’d say often enough to make pursuing off-beat ideas worth any knowledge seeker’s professional time. Even if 90+ percent of them are dead ends, one learns so much in the pursuit, and the occasional idea that pans out is usually worth enough to more than compensate for the time invested in finding it. Most definitely, many great ideas come from non-academics. In part, this is probably because many potentially great academics (I know a few personally) have been turned out of academic institutions, or discouraged from entering them professionally, because they pursued a politically incorrect idea or tried to follow the data to its logical conclusion without regard to whose toes that stepped on. Today’s academia is much less tolerant of dissent than that of 20-40 years ago.



CML: As pickings grow slimmer due to declining enrollment and rising competition from an ever-larger number of advanced degrees, as tenure fades into history and the shadows of orthodoxy and political correctness loom ever larger over the ivory tower, the talented maverick finds scant appreciation in academe. Is this just part of a cycle, or is there really no light at the end of the tunnel? Is there anywhere that talented non-academics can take their good ideas, given that many of the most respectable scientific periodicals are so biased in favor of academia and academic credentials that they amount to the in-house journals of exclusive clubs?



TVF: As orthodoxy closes ranks and the number of talented, excluded people grows, the outsiders inevitably organize and start movements, new journals, etc. The internet has greatly sped up this process, and caused the mainstream to lose some of its tight control over communications and funding. I suspect I’m not alone in my decision to go forward seeking more answers in my lifetime, rather than to spend huge blocks of time battling the mainstream to accept a few of them. I now see the mainstream of my field as increasingly irrelevant. I see the path forward as accelerating that irrelevance rather than fighting the resistance. (I do, however, continue to value whatever valid criticisms my colleagues offer to any idea.) Eventually, the “private club” aspect of mainstream science will become obvious to all, and it will lose public support.



I may or may not live to see that happen. But I hope to weaken my share of the mainstream’s support pillars until my colleagues return to strict adherence to scientific method, following which the replacement of models where merited will happen quickly.



The other side of the coin is that most people who contact Meta Research have little self-perspective, are not short on ego, and often bring a super-confidence in the correctness of their own ideas that makes evaluating them for the record a dicey proposition. I often tell novices that they are not ready for serious discoveries until they have had at least three of their greatest inspirations shot down by their own arrows through diligent research and controlled testing. Indeed, it has been a humbling experience for me to come to realize that most ideas that do work out have been proposed in the past, but did not catch on.



CML: As the empirical arguments of physical and cosmological theories become less accessible, essential experimental equipment grows more expensive and provides less “bang for the buck”. Meanwhile, the relevant mathematics grows more arcane and less tractable. Under these circumstances, is it really possible for researchers to test their own theories? Are you talking about some kind of rational testing procedure?



TVF: One approach is the one I have used with the exploded planet hypothesis. I make predictions with it, and the mainstream ends up confirming prediction after prediction as it makes new discoveries, all the while trying to patch their own models to keep up. People without vested interests stay blind only for so long. For example, last month, for the third year in a row, our eph-based predictions for the times and rates of Leonid meteor storms were judged the best by experts. All the mainstream models except the one that most closely imitates ours have now been judged failures.



CML: You write, "So the big bang postulates that the cosmological expansion occurs, not because galaxies move apart through space, but because more space is being continually added between them. This continual creation of space ex nihilo is an integral part of the theory. Without it, the cosmological principle would be violated." This seems to imply that self-containment can only be achieved through the ex nihilo creation of space within the universe. Is this correct?



TVF: I do not fully understand this question, especially the phrase “self-containment” in this context. My quoted words were intended to mean that, without the continual creation of new space everywhere, the Big Bang universe would have a center and an edge with us near the center, in violation of the cosmological principle that the universe should look the same from everywhere within it and that we should occupy no special place. Of course, Big Bang proponents are generally unconcerned about violating the “perfect cosmological principle” because both the universe’s origin and our existence are at special epochs in time, which makes the time dimension special despite its supposed similarity to spatial dimensions.
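For readers unfamiliar with that picture, the toy sketch below illustrates the standard expanding-space kinematics being described; nothing in it is specific to either the Big Bang or the Meta Model beyond what the quoted passage says. Galaxies keep fixed comoving coordinates while a scale factor grows, and every galaxy then sees all the others recede at a rate proportional to distance, with no preferred center.

```python
import numpy as np

# One-dimensional toy "universe": galaxies at fixed comoving coordinates (Mpc).
comoving_x = np.array([-30.0, -10.0, 0.0, 15.0, 40.0])

def proper_positions(a):
    """Proper distance = scale factor * comoving coordinate: the galaxies do
    not move through space; the space between them grows as a(t) grows."""
    return a * comoving_x

# Scale factor at two nearby times (units are arbitrary for this illustration).
a1, a2, dt = 1.00, 1.01, 0.1
x1, x2 = proper_positions(a1), proper_positions(a2)
velocities = (x2 - x1) / dt          # recession rate of each galaxy
H = (a2 - a1) / (dt * a1)            # Hubble rate, (da/dt)/a

# Viewed from ANY chosen galaxy, every other galaxy recedes at v = H * distance,
# so the expansion has no center or edge (the cosmological principle).
us = 2
for i in range(len(comoving_x)):
    d = x1[i] - x1[us]
    v = velocities[i] - velocities[us]
    print(f"galaxy {i}: distance {d:7.2f} Mpc, recession {v:7.3f}, H*d {H*d:7.3f}")
```

Choosing a different galaxy as the origin reproduces the same pattern, which is the sense in which the continual addition of space preserves the cosmological principle.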



CML: Please pardon the ambiguity. In a cosmological context, “self-containment” is generally understood to mean that explanations should be intrinsic to observable reality and free of appeals to external spaces or media. For example, General Relativity appeals to nothing external, describing the geometry of spacetime using functions of the matter it contains. In contrast, modern cosmology is replete with multiverses and meta-inflationary many-bubbled hyperuniverses with respect to which observable reality plays a merely secondary role. Given your own ideas regarding the intrinsic definition of scale, is your ontological explanation of space and time intrinsic to observable reality (or is something more assumed)?



TVF: In the Meta Model, insofar as possible, nothing is assumed. The starting point is a complete void with no direct or implicit properties. Using deductive reasoning, we eventually see emerging concepts that resemble those we call “space” and “time”, yet are subtly different from our usual understanding of those terms. The important point is that the concepts emerge from logical deductions from a seemingly safe starting point, rather than being themselves assumptions or being products of induction (which is intrinsically non-unique in physics).



For me, the most amazing result from this deductive path is a new meaning of “existence” that carries with it an answer to the age-old mystery of the “origin” of the universe. Seemingly empty space must actually be occupied at some infinitesimal level or it could not exist. “Time” becomes synonymous with “change” to the extent that, if a universe had no change, it could have no time. And nothing that exists can ever pass out of existence (nor, by extension, come into existence), but only change form – assemble into larger bodies or disassemble into tiny ones, but with every bit accounted for.



So I would answer “yes”, these properties of a universe are intrinsic to observable reality, and not simply overlays on it.



My own position is that the “no creation ex nihilo” principle, much like the causality principle, the “finite cannot become infinite” principle, and several other principles of physics, is a logical requirement of any description of reality. I consider it a logical error to impute violations of any of these to nature based on their arising in equations that attempt to describe nature. I explain my position in “Physics has its principles”, Redshift and Gravitation in a Relativistic Universe, Konrad Rudnicki, ed., Apeiron, Montreal, 95-108 (2001).





CML: You seem to appreciate that many cosmological hypotheses, even those taken for granted by the mainstream, are dependent on the models of those who propose them and therefore open to question on a fundamental level. Yet all these models have certain things in common, e.g. quantum mechanics, some form of relativity and even model theory. But the first two ingredients have distinctive logical structures, and model theory is nothing less than a branch of modern logic. Are you aware of any theories that utilize model theory and other fields of logic to deduce necessary truths about the content of those theories? While the validity of such an approach would seem to be guaranteed by the fact that all rational theories are logical, would this not amount to asserting that reality possesses, at least in part, a rational (as opposed to an empirical) basis...that it originates at least partially in the mind?



TVF: While I am unfamiliar with “model theory”, as I understand it from the question’s context, the cosmology and other models I reported in DM2 are of this type – deductive derivations from logical first principles instead of inductive guesses starting from observation or experiment.





CML: Again, my apologies. In the branch of advanced logic dealing with formalized theories, “model theory” deals with the necessity of providing theories with correct real-world interpretations, i.e. valid “maps” into the real world (along with the associated levels of reference, the overall properties of theories, the interpretation of variables, relations, function and operator symbols and so on). Such a correct interpretation is called a “model” of the theory. Because any scientific theory requires a model, the purely mathematical constraints imposed by this requirement, e.g. consistency and closure (explanatory self-containment), have a weight at least equal to that of observation. In other words, theories have a priori components that may determine certain aspects of content. To what extent is this the approach you’ve taken with the Meta Model?



TVF: I am unfamiliar with model theory, and therefore with thinking in those terms. For me, the strength of the Meta Model is its lack of assumptions (starting with nothing) combined with purely deductive reasoning (no inductive steps or helper hypotheses). To the extent that the logic is valid, it should then be internally consistent and closed.



However, in chapter 20 of DM2 I present some of my own thoughts about “reality” and “truth”. I conclude there that we are logically compelled to adopt the position that a unique, objective, external reality (our common shared experience) exists. Although we cannot prove that it does, we can show that our senses and minds occasionally lead us into error, whereas the assumption of a shared objective reality has not yet done so in any obvious way. It would therefore be illogical to adopt the process known to lead to occasional error (that reality is partially in our minds) over the one for which no flaw has yet been demonstrated (that reality is external).





CML: Suppose that we define “objective reality” as that which is external to the observer, and “subjective reality” as that which is intrinsic to the observer. Then some would maintain that the “reality is external” premise has already been shown to be seriously flawed, inasmuch as it seems to deny or at least overlook the fact that reality has a subjective aspect. After all, while physical reductionism can certainly help us explore the physical correlates of things like consciousness, intentionality, emotions and qualia (pure qualities such as colors, which we can objectively compare only up to descriptive isomorphism), it can never really explain the fact that we experience them as we do. As a philosopher might put it, one cannot explain scientific observations without explaining certain essential ingredients of such observations, namely observers. While we eagerly await the delivery of your book, can you explain how subjective experience fits into the Meta Model?



TVF: How is it a fact that reality has a subjective aspect? Or does this question mean only that some observers err in their perceptions of the objective reality? The “fact” that reality has a subjective aspect seems to be introduced as a supposition in the first sentence above, but then treated as more than a supposition. I see no compelling reason to accept the existence of “subjective reality” as a valid premise if it is to be placed on the same level as objective reality, and one reason (the one at the end of my previous paragraph above) for assuming otherwise.



At present, subjective reality has no place in the Meta Model. How is it supposed to differ from dreams or hallucinations? The Star Trek “holodeck” experience is a good example of what I imagine someone might mean by a subjective reality. Yet it deceives the observer about the nature of the shared objective external reality that produces the holodeck illusions. It is not obliged to obey the same laws as a true objective reality. In the latter, if you kick a rock hard enough, it will hurt. Not necessarily so in the former.



CML: It is often claimed that most physicists and cosmologists now subscribe to the Many Worlds interpretation of quantum mechanics. This interpretation employs what amounts to an exhaustion algorithm to get around the measurement problem. But as far as Occam's Razor is concerned, it is so far "off the deep end" as to be laughable. Despite having initially supported it, even Hugh Everett's mentor John Wheeler eventually dropped it due to the sheer weight of its "metaphysical heavy baggage". Is the measurement problem really so intractable, and so odious, as to rob otherwise-sane scientists of all respect for theoretic economy?



TVF: My thoughts on quantum physics appear in chapter 5 of DM2. The deductive cosmology of chapter 1 insists that scale and matter are infinitely divisible on the small scale (as well as producing infinite assemblages on the large scale). This leads to a new interpretation of the quantum mechanics experiments that is striking for its lack of paradoxes. However, the real key to the road out of quantum madness is the realization that faster-than-light (ftl) propagation and communication are not only possible, but are apparently realized in the case of gravitational forces. See my two papers on “the speed of gravity” at our web site, the same location as given above.



CML: We read them before we wrote you, and they were very good reading indeed! (That’s why we got in touch.) Can you tell us how to obtain some of your other papers, which we’ve had a bit more trouble locating on the Meta Research site? We are particularly interested in obtaining the following papers:



Van Flandern, T., "Relativity with Flat Spacetime", MetaRes.Bull. 3, 9-13 [see <metaresearch.org> ] (1994).



Van Flandern, T., "Possible new properties of gravity", Parts I & H, MetaRes.Bull. 5, 23-29 & 38-50 [see <metaresearch.org>] (1996).



If you can provide us with information on how to subscribe to the MRB, we would like to do so. We will also pass this information on to our members.



TVF: Subscription information is at our web site under “Publications”, and we also send a subscription form with the complimentary issue anyone may request.



“Possible new properties of gravity” is published in Astrophys.&SpaceSci. 244, 249-261 (1996). The other article is only in the MRB. If you supply a mailing address, I’ll send it as a complimentary issue.



The “wrong turn” that physics took was to reason that special relativity (SR) proves that ftl propagation is impossible; that SR has been verified by eleven independent experiments; and therefore the universe has a speed limit. However, Lorentzian relativity (LR) passes the same eleven experimental tests, but differs from SR in that it allows ftl propagation. The only kind of experiment that can distinguish between the two theories is one that involves ftl phenomena. Gravitational forces now seem to provide that ftl phenomenon, and distinguish the two theories in favor of LR. Therefore, ftl is again allowed in physics, including quantum physics, where it is badly needed to achieve understanding.



CML: I agree that SR does not preclude nonlocal effects, i.e. physical correlations appearing to violate the principle of locality (“nothing travels faster than light”). On the other hand, suppose that superluminal travel is possible. Then if Galilean relativity were in effect, an observer M would measure the speed of light emitted from the “headlight” of a spaceship N flying by at relative velocity 2c, for example, to be at least double the speed of light emitted in the direction of N by M himself. In other words, the speed of light would no longer be “invariant”. So if superluminal travel is possible, it would seem that both Galilean and Einsteinian relativity are out the window. Does Lorentzian relativity, as distinguished from Galilean and Einsteinian Relativity, actually permit superluminal travel (as opposed to an extended relationship distinct from “travel”)?



TVF: Short answer: unqualified “yes”. But the question seems to imply that nature somehow will lead us to measure a relative speed of the two light beams of just c, and not 2c. However, that is not necessarily true. No experiment contradicts Galilean relativity.



It is important to keep in mind that the invariance of the speed of light is a postulate of special relativity, not an observed fact. It is unremarkable that the speed of light is independent of the speed of its source. That is true of all types of wave motion (e.g., sound). It would be remarkable if the speed of light could be shown to be independent of the speed of the observer, but it can’t. That property is simply postulated by Einstein. Then the clocks at the beginning and end of any light path of known length are synchronized in such a way that the speed (distance divided by time interval) over that length comes out to be c by construction.



So when the relative speed of two spacecraft approaching one another, each traveling at 90% of c relative to us, is measured, the measurement depends on how the clocks are synchronized. If Einstein’s prescription is used, the measurement will always be less than c. If Lorentz’s prescription is used (as it is, for example, with GPS satellites), then the measured speed will be 1.8 c.
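The arithmetic behind that example is compact enough to show directly. The sketch below illustrates only the two bookkeeping conventions just described: the 1.8c figure is the frame-coordinate closing speed of the two craft, and the sub-c figure is what one craft attributes to the other when clocks are Einstein-synchronized, i.e. the standard relativistic velocity composition.

```python
C = 1.0           # work in units where the speed of light is 1

u = 0.9 * C       # craft A, approaching us from one side
v = 0.9 * C       # craft B, approaching us from the other side

# Rate at which the gap between the craft shrinks, in OUR frame's coordinates.
# No single object exceeds c; this is the 1.8c figure quoted above.
closing_speed = u + v

# Speed that craft A attributes to craft B when its clocks are synchronized by
# Einstein's prescription: the special-relativistic velocity composition.
einstein_measured = (u + v) / (1.0 + u * v / C**2)

print(f"frame-coordinate closing speed:    {closing_speed:.4f} c")      # 1.8000
print(f"Einstein-synchronized measurement: {einstein_measured:.4f} c")  # 0.9945
```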



CML: Nonlocality has been shown to be inconsistent with any theory obeying the principles of locality, reality and induction. Yet conservation laws require it in the context of not only quantum mechanics, but celestial mechanics as well (where the conservation of angular momentum requires that gravitational effects be virtually instantaneous). Do you have a model supporting instantaneous propagation of forces? For example, have you considered Bohm's "holographic" interpretation of quantum mechanics in such a role?



TVF: As above, see my two papers on “the speed of gravity”. The first appeared in Phys.Lett.A, v. 250, pp. 1-11 (1998). A preprint of the second is at the web site.
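As a rough feel for the angular-momentum point raised in the question, the toy integration below implements the classical (Laplace-style) aberration argument rather than anything from the Phys.Lett.A paper: if the force on an orbiting body pointed toward where the source appears after a propagation delay, it would pick up a small component along the velocity, and angular momentum would grow steadily; letting the propagation speed c_g become effectively infinite removes the effect. The specific numbers and the simple aberration model are illustrative assumptions.

```python
import numpy as np

def angular_momentum_drift(c_g, n_orbits=20, steps_per_orbit=2000, GM=1.0):
    """Integrate a body on an initially circular orbit (GM = r0 = v0 = 1) with
    the attraction aimed at the source's aberrated direction, displaced toward
    the body's velocity by roughly |v|/c_g.  Returns the fractional change in
    the specific angular momentum |r x v| over the run.  (The force depends on
    velocity, so this leapfrog is only approximately symplectic; that is fine
    for a qualitative toy demonstration.)"""
    r = np.array([1.0, 0.0])
    v = np.array([0.0, 1.0])                 # circular speed for GM = 1, r = 1
    dt = 2.0 * np.pi / steps_per_orbit       # orbital period is 2*pi

    def accel(r, v):
        dist = np.linalg.norm(r)
        direction = -r / dist + v / c_g      # aberration-shifted direction
        direction /= np.linalg.norm(direction)
        return GM / dist**2 * direction

    L0 = abs(r[0] * v[1] - r[1] * v[0])
    a = accel(r, v)
    for _ in range(n_orbits * steps_per_orbit):
        v_half = v + 0.5 * dt * a
        r = r + dt * v_half
        a = accel(r, v_half)
        v = v_half + 0.5 * dt * a
    L1 = abs(r[0] * v[1] - r[1] * v[0])
    return (L1 - L0) / L0

# Finite c_g -> small forward force component -> angular momentum is not conserved.
print(f"c_g = 1000 * orbital speed: dL/L = {angular_momentum_drift(1e3):+.2e}")
print(f"c_g effectively infinite:   dL/L = {angular_momentum_drift(1e12):+.2e}")
```

Pushing c_g toward "effectively instantaneous" restores conservation of angular momentum, which is the sense in which the question says celestial mechanics requires very fast gravitational propagation.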