Mathematical Beauty

by Dave Sikkema

“Mathematics, rightly viewed, possesses not only truth, but supreme beauty — a beauty cold and austere, without the gorgeous trappings of painting or music.” – Bertrand Russell

As a 4th grade teacher in a Christian and classical school, my job is to help students appreciate what is good, true, and beautiful about the world around them. Naturally, most kids find the good, the true, and the beautiful in more popular subjects such as art and music (and, somehow, P.E. and recess), but they seem to find only the devil in long division and 3-digit multiplication.

Math, in their eyes, is a product of the Fall. This is evident when they ask, mostly through tears, “Mr. Sikkema, why do I even need to know this!?” It seems to me they are really wondering, “What do all these numbers and mathematical processes have to do with reality?”

This short video reminded me this morning that numbers have everything to do with reality.  Math is one of the many languages of God, and it offers a structure that is both good and true for the beauty we see around us.


Dave Sikkema is a 4th grade teacher at Regents School of Austin. This post was originally published on his blog “Backwards with Time” and is shared here with permission.

Operationalizing a Christian Foundation for Statistical Inference

by Andrew Hartley

Andrew Hartley is the author of Christian and Humanist Foundations for Statistical Inference; Religious Control of Statistical Paradigms. For more information on this work, please visit the Resource Book page. Guest author Steve Bishop posted an interview with Andrew as part of his series on Christian Mathematicians.

In my 2008 book, Christian and Humanist Foundations for Statistical Inference (Resource Publications; hereinafter, Foundations), I sketched out what I see as a biblically consistent philosophy of statistics. Since then, a few statisticians and mathematicians have requested a framework for “operationalizing” that philosophy. What, they ask me, does the philosophy entail practically for the Christian doing statistical inference, or teaching it, in the present age? This post takes a first pass at addressing these requests without, however, offering a self-contained explanation that can stand apart from that book, or from a suitable background in reformational philosophy (sphere sovereignty, sphere universality, etc.).

At the outset, I should set some realistic expectations concerning the faith-statistics integration that is possible now. As will become clear below, fully implementing the philosophy of statistics introduced in 2008 is impossible without a large-scale reformation of both the producers & the consumers of statistical reports and results. Only so much can be done in this fallen world, where sin crosses all boundaries and infects all people. As you and I consider “Christian statistics,” we should, therefore, consider the differences between how we might practice statistics now and how we might practice it in the New Jerusalem, with our “new bodies.” Caveat lector.

This post requires qualification also in that the present times—the early decades of the 21st century—pose their own distinctive threats to a biblically consistent statistics, so that the features I have stressed of such a statistics might not be the ones to stress in a different time and place. The humanistic “science” & “personality” motives have shaped much of the foundations and practice of present-day statistics; therefore, as we outline a statistics consistent with the Philosophy of the Law Idea (PLI), we are compelled to emphasize elements of the PLI that address those motives. No one should be surprised if, decades or centuries from now, different threats arise, necessitating different Christian responses.

Discussion of the main topic here is set in the narrow context of the classroom, for that setting may offer the best chance of transforming statistical practice. A philosophy of statistics deals, after all, with fundamentals, and statistics educators take on a primary role in introducing us all to statistical fundamentals. Therefore, this post is framed with undergraduate college professors and high school teachers in mind, and offers examples and explanations that, hopefully, they can use easily in lectures, small-group discussions, and student exercises without a great deal of modification. Furthermore, this post assumes that the students, too, are seeking to conceptualize and practice statistics in ways that honor the Lord; indeed, much of what this post promotes is impossible without such an attitude, regardless of the efforts of the educator.

Some Christian principles for our vocations and sciences that should be conveyed in the classroom are as follows:

  1. All aspects (kinds of laws and properties) of human experience (quantitative, spatial, kinetic, …) are equally valid, important, and dependent upon God. No aspect can be reduced to any other.
  2. All the aspects, though mutually non-reducible, are nonetheless connected. As Herman Dooyeweerd has said, all functioning is “analogical.”
  3. Sin has affected humans—and, thus, the world we are called to care for as vicegerents—in all the aspects, though in their directions, not their structures.
  4. God’s redemptive work, through the cross, frees His people to reclaim what sin has stolen, and to liberate the earth from its frustration, progressively, albeit with fits and starts.
  5. Science in general, and statistical inference in particular, must be founded on pre-scientific experience (what Dooyeweerd calls the “naïve attitude of thought”), to enhance rather than replace that experience.

What follows here assumes that the reader is familiar already with these principles; hence, it does not explain them or even cite many of the wonderful lectures and books that lay them out.

I propose “operationalizing” those principles in an introductory statistics course using a fairly lengthy illustration—a story, if you will—although this post gives only an outline for the illustration. It introduces the discipline of statistics by discussing a subjective decision-analysis problem, first without empirical data, & then—bringing in Bayesian analysis—with such data. Presenting the entire illustration will require several weeks, and multiple detours to clarify the concepts that are more difficult to grasp. In the end, though, if successful, the illustration will give a broad overview of statistical inference (purposes, foundations, ways of thinking, and so on) while aiding the statistician in modeling the above principles.

Here are some insights, consequences of the principles mentioned above, that should surface—and be emphasized—as the teacher discusses the illustration with students:

  1. Founding Inference on the Pre-scientific “Naïve Attitude”: Statistical inference requires combining new data with previous knowledge. Data alone do not suffice as the basis for making statements about the unknown based on the known; rather, prior beliefs should influence post-analytic belief. Statistical inference—like other scientific activities—should enhance, and not replace, everyday ways of thinking (Dooyeweerd’s “naïve attitude”). To the extent the data do not overwhelm (“swamp”) prior belief, statistical inference depends on forming reasonable, realistic priors. Thus, inference is best when those forming the priors are experts in their subject matter, honest and level-headed in their assessments of prior evidence and of the applicability of that evidence to novel situations, and willing and able to distinguish between reasonable beliefs and the beliefs that would benefit them personally. An awareness of the importance and limitations of prior opinion will, in later, more detailed theoretical discussions, ease students into discussion of Bayes Theorem. This insight helps to inoculate students against the humanist “control” motive. Jan C. Geertsema, too, has discussed the necessary founding of statistical inference in the “naïve attitude,” borrowing from Stoker the idea of the “contextual view of science.”
  2. Objective Implications of Data: Prior opinion affects statistical inference; however, the data, too, have an impact, and the more data that are available, the more precise (in most inferential situations) are point and interval estimates. Thus, data place constraints on justifiable scientific belief. This seems to cohere well with the PLI, which implies that all things are subject to the laws of all the aspects all the time, implying in turn that beliefs (scientific or otherwise), while qualified by laws of the pistic (fiducial) aspect, are also subject to quantitative laws. In other words, contrary to the humanist “freedom” or “personality” motive, norms exist for beliefs, & we are not free to believe whatever we wish.
  3. Decisions: If we take as given that decision making is rational when it maximizes overall expected net benefit, then this decision making requires incorporation of utilities, that is, the possible, but sometimes only partly certain, costs and benefits of the candidate decisions. The expected net benefit of a decision is a synthesis of the quantitative sizes of the (usually multiple) possible payoffs, and the probabilities of those payoffs. In this way, even if a particular possible state of nature is extremely unlikely, acting as if it is true might be prudent. For instance, if a patient exhibits symptoms consistent with a rare and deadly disease, treating the patient for the disease may be sensible even if the patient is unlikely to have it (see the sketch following this list). This insight draws economic laws and properties into statistics, that is to say, laws and properties about investments, sacrifices, and rewards; therefore, it points up an additional means by which statistics must acknowledge a multiplicity of aspects.
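
To make the third insight concrete in class, a brief numeric sketch may help. The following Python fragment is a minimal illustration of expected-utility reasoning; the disease probability, the utilities, and the function name are my own hypothetical teaching choices, not figures from Foundations.

```python
# A minimal sketch of insight 3: expected-utility decision making.
# All probabilities and utilities here are hypothetical teaching numbers.

def expected_utility(action, p_disease, utility):
    """Average the utility of an action over the uncertain disease status."""
    return (p_disease * utility[(action, "disease")]
            + (1 - p_disease) * utility[(action, "healthy")])

p_disease = 0.02  # the disease is rare, even given the symptoms

utility = {  # negative values are costs
    ("treat", "disease"): -10,    # disease treated early: moderate cost
    ("treat", "healthy"): -1,     # unnecessary treatment: minor cost
    ("wait",  "disease"): -1000,  # untreated deadly disease: catastrophic
    ("wait",  "healthy"): 0,      # nothing lost
}

for action in ("treat", "wait"):
    print(action, expected_utility(action, p_disease, utility))
# treat: 0.02*(-10) + 0.98*(-1) = -1.18
# wait:  0.02*(-1000) + 0.98*0  = -20.0
# Treating maximizes expected utility even though P(disease) is only 2%.
```

The point for students is the synthesis the insight describes: the decision turns on the product of payoff sizes and probabilities, not on the most probable state of nature alone.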

What follows here, then, is the outline of the illustration. An instructor would, in most teaching contexts, flesh out the outline with numerous other examples & more details:

  • Every part of human life involves risk-taking decisions, i.e., allocating resources in the presence of uncertainty about the outcomes of the allocations. Many high-school graduates, for instance, must decide whether to begin working full-time immediately, or to postpone work to attend college in the hopes that additional education will yield additional opportunities and rewards several years hence.
  • We generally want to make the decisions which maximize expected rewards or, in other words, which minimize risk.
  • We handle most such decisions best through intuition & common sense, without any formal statistical analysis. In these cases, we don’t engage in what Roy A. Clouser calls “abstraction” or, much less, “high abstraction,” so that our deciding remains in the “naïve attitude.” E.g., when I make such minor decisions as whether to store a small sum of money in my wallet or in a bank, I do so quite informally. I fairly quickly & casually consider my own subjective feelings (which are qualified by Dooyeweerd’s “sensory” aspect) about the probability—viz., strength of belief—that, say, I will lose my wallet relative to the probability that the bank will fail. Those probabilities certainly possess quantitative laws and properties, but they involve laws and properties of every other aspect, too, and I do not focus on (or “abstract out”) the quantitative ones to the exclusion of the others.
  • On the other hand, if we did demand abstract quantitative thinking for even the most minor of decisions, we would incur numerous detriments. Obviously, we would slow daily life almost to a standstill; perhaps less obviously, though, we would also risk neglecting, or giving short shrift to, aspects of the decision other than the quantitative one, e.g., losing sight of the
    • social impacts of the decision, such as the social interactions I enjoy when walking to the bank
    • chemical impacts of the decision, such as the pollution I might cause if I drove my automobile to the bank
    • ethical impacts of the decision, such as the ability of the bank to loan more money to needy businesses when I entrust the bank with my money

As long as we remain in the naïve attitude, we may be better able to appreciate these and other impacts of our decisions which are qualified by a wide array of aspects. So, one of the first main messages to convey to statistics students is that statistical reasoning can enhance everyday, pre-scientific experience, but we should not insist on “scientistically” imposing that reasoning indiscriminately.

  • However, when the implications of making a sub-optimal decision are serious (i.e., major loss of life or property), then probabilistic & statistical reasoning are often justified, to improve the chances of identifying decisions that are optimal from a strictly quantitative perspective. That is, such reasoning, though it does require some expertise & time, is available to enhance our intuition. That enhancement can, in turn, enrich human life, as long as quantitative findings are, subsequently, integrated properly back into the multi-aspectual fabric of everyday existence.
  • As an example of one of these more consequential decision-making contexts, imagine a society or a large for-profit firm deciding whether to embark on a trip to a far-away planet, in the hopes of locating—and returning to Earth with—a large quantity of a precious metal. Such an endeavor carries the potential of great rewards, but also entails great risks to both life and property; hence, deciding whether to attempt the mission deserves careful consideration & systematic analysis.
  • Say that, upon analysis of the possible outcomes of the trip, scientists determine that
    • The probability of success is 47%.
    • The reward, upon success, would be $50 million.
    • The cost of the trip would be $25 million.
  • Given these costs, benefits & so on, the expected net benefit of the trip would be $50M × 0.47 − $25M = −$1.5 million, which is less than 0; hence the trip seems unjustified, at least from a financial perspective (recognizing, though, that many other considerations together might nonetheless justify the trip). Later, if the statistics course is advanced, the instructor might justify, using so-called “Dutch Book” arguments, decision making based on this type of probabilistic rule.
  • This discussion has illustrated decision analysis under uncertainty WITHOUT random data informing the decision. One might call it “statistical” decision making; however, no “statistics” are involved, so if it needs a name, “probabilistically supported” decision making might be the better term.
  • On the other hand, as an example of decision making under uncertainty informed by random data, suppose data were collected on successful & unsuccessful space missions. Among a sample of historical missions, 85% (say) were successful. This summary statistic changes the analysis of the decision by updating the probability of success for the prospective mission. Such updating leads the class, in turn, to Kolmogorov’s definition of conditional probability, and Bayes Theorem (which can be taken as a consequence of that definition).
  • Suppose as well that, following Bayesian reasoning, the probability of success for the mission is updated from the prior 47% to the posterior 63%. This implies the expected net benefit of the trip has changed to $50M × 0.63 − $25M = $6.5 million, so that the trip is now justified financially. (A worked sketch of one such update appears after this outline.)
  • The instructor of an advanced statistics class might, at this point, discuss briefly with students the principle of stable estimation, the upshot of which is that, as the number of data and/or their precision increases, the impact of a wide range of prior probability distributions decreases. In other words, the principle shows that even “two people with widely divergent prior opinions but reasonably open minds will be forced into arbitrarily close agreement about future observations by a sufficient amount of data” (Edwards et al., 1963). While students in introductory statistics courses do not need to learn every facet of this principle, they should know that it exists and have a basic grasp of its implications. This will help them appreciate the principle that data place limitations on beliefs.
  • In summary, we take risks and seek to maximize expected returns on our efforts & investments, whether we are deciding whether to store a little money in a bank or deciding whether to embark on a mission to another planet. The first scenario calls, though, for intuitive, informal judgment, whereas the second calls for comparatively careful, analytic measurements & comparisons of risks & rewards. The levels of abstraction and study appropriate for these scenarios fall along a continuum. Many other points along the continuum exist, too; less formal approaches would be appropriate when deciding whether to build a fence around the perimeter of a one-family house to keep out potential intruders, but more formal analyses would be helpful when deciding whether to research & develop a new anti-diabetes drug. Informal intuition is most beneficial, one might argue, when decisions must be made quickly and/or the possible losses of sub-optimal decisions are small. Formal analyses are useful, though, when sufficient time & expertise are available for them, & the possible gains of optimizing the decision are large. All statistical methods strike some balance between these types of intuition & formalism.
  • When the practicing statistician wishes to perform inference or assess potential decisions, & needs to select reasoning & methods appropriate for the situation, awareness of this continuum of formalism—as we might call it—might remind him/her that each of the various levels of formalism carries costs and benefits. It also sets a context, though, for appreciating the potential usefulness of some statistical methods that, for any of a variety of reasons, do not provide ideal results, but are cost-effective in their ease of computation. The results of some methods based on asymptotic distributions, for instance, have higher-than-necessary standard errors or even biases, but are quickly derived with existing computer software or even hand calculators. On the other hand, as could be conveyed next to students, some frequentist statistical results, though they are statements about data given parameters, can be re-interpreted as Bayesian—and, therefore, inferential—results, at least approximately.
  • Having introduced statistical inference using Bayes Theorem, the instructor can turn to a discussion of statistical inference using frequentist statistics. The intermediate or advanced statistics student must understand Neyman-Pearson frequentist reasoning, & Ronald A. Fisher’s hybridization of frequentist & Bayesian reasoning (if the latter can be “understood” & not only “felt”); however, discussions of these approaches & their underlying philosophies should aim primarily to show how their results can, in certain circumstances, be interpreted inferentially, viz., Bayesianly. The possibility of such correspondences is fairly easy to convey to students familiar with calculus; they can be shown that, when making a posteriori inferences about a single normal mean m with a known standard deviation, given a flat prior distribution, a sample, and a sampling stopping rule independent of the data,
    • the maximum likelihood estimator equals the posterior mean,
    • the p-value of the one-sided hypothesis H0: m ≤ 0 equals the posterior probability that m ≤ 0,
    • the 95% symmetric confidence interval for m equals the 95% highest posterior density interval for m.

The student will then appreciate why frequentist methods, despite being developed using deductive (rather than inferential) reasoning, sometimes do support the progress of science & decision making. S/he will also have access to a larger “toolbox” of statistical methods, some of which are easier to implement than Bayesian ones and yet are nearly as accurate. The sketch below checks these three correspondences numerically.
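
For students who learn by experiment, a short computational check of the three equalities may land the point more firmly than the algebra alone. This is a minimal sketch under the stated conditions (known sigma, flat prior, fixed sample size); the simulated data, true mean, and seed are my own arbitrary choices.

```python
# A minimal numeric check of the three Bayesian-frequentist correspondences
# above: one normal mean m, known sigma, flat prior on m, fixed sample size.
# Simulated data; the true mean, sigma, n, and seed are arbitrary choices.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2024)
sigma, n = 2.0, 25
data = rng.normal(loc=0.8, scale=sigma, size=n)   # hypothetical sample
xbar = data.mean()
se = sigma / np.sqrt(n)
# With a flat prior, the posterior of m is Normal(xbar, se**2).

# 1. The maximum likelihood estimate equals the posterior mean.
print("MLE:", xbar, " posterior mean:", xbar)

# 2. The one-sided p-value for H0: m <= 0 equals posterior P(m <= 0).
p_value   = 1 - stats.norm.cdf(xbar / se)   # P(Xbar >= xbar | m = 0)
post_prob = stats.norm.cdf(-xbar / se)      # posterior P(m <= 0)
print("p-value:", p_value, " posterior P(m <= 0):", post_prob)

# 3. The 95% symmetric confidence interval equals the 95% highest posterior
#    density interval (the normal posterior is symmetric and unimodal).
z = stats.norm.ppf(0.975)
ci  = (xbar - z * se, xbar + z * se)
hpd = stats.norm.interval(0.95, loc=xbar, scale=se)
print("CI:", ci, " HPD:", hpd)
```

Each pair prints identically, digit for digit, which is exactly the correspondence the bullets assert for this special case.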

  • All students should be informed, though, that such Bayesian-frequentist correspondences do not hold, or cannot be verified to hold, if certain conditions are not met. They often do not hold, for instance, if Bayesian priors are not flat, if the sampling rule depends on the data, or if either the parameter space is discrete & the sample space is continuous, or vice versa. Therefore, if the statistician interprets frequentist results inferentially, s/he should do so discriminately, ensuring that their inherently deductive meanings can be re-interpreted inductively, at least approximately. The challenges of performing these checks can impede the use of frequentist results, though, to the point that the statistician might opt for Bayesian ones instead.
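
Returning, finally, to the space-mission illustration: the outline gives only the prior (47%), the historical success rate (about 85%), and the posterior (63%), without fixing a model. The sketch below is one hypothetical reconstruction, assuming a Beta prior and Beta-Binomial conjugacy; the particular Beta parameters and the historical sample size are my own choices, selected so the output lands near the outline's figures.

```python
# Hedged reconstruction of the space-mission decision from the outline.
# The Beta prior and the historical sample below are hypothetical choices;
# the outline fixes only the 47% prior, ~85% data rate, and 63% posterior.
from scipy import stats

a, b = 4.7, 5.3                    # Beta prior with mean 0.47
successes, failures = 6, 1         # 6 of 7 past missions succeeded (~86%)

# Beta-Binomial conjugacy: the posterior is Beta(a + successes, b + failures),
# a direct consequence of Bayes Theorem for this model.
posterior = stats.beta(a + successes, b + failures)
p_success = posterior.mean()       # ~0.63

reward, cost = 50.0, 25.0          # $ millions, from the outline
net = reward * p_success - cost
print(f"posterior P(success) = {p_success:.2f}")   # ~0.63
print(f"expected net benefit = ${net:.1f}M")       # ~$6.5M > 0: justified

# Stable estimation (Edwards et al., 1963): with much more data, even a
# pessimistic prior is largely "swamped". E.g., 60 successes in 70 missions:
skeptic = stats.beta(2.0 + 60, 8.0 + 10)   # hypothetical prior Beta(2, 8)
print(f"skeptic's posterior mean = {skeptic.mean():.2f}")  # pulled toward 0.86
```

With the small hypothetical sample the prior still matters (the posterior of about 63% sits between the 47% prior mean and the 86% data rate); with the larger sample the data dominate, which is the stable-estimation point from the outline.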

The above pedagogy will, I hope, convey the principles for statistical inference I’ve outlined in Foundations, & suggest how that inference can be performed so as to cohere with those principles. I do want to forestall a possible misunderstanding, though: the educator & the student should not take the use of subjective Bayesian inference above as an indication that Christian statistics is necessarily Bayesian. Foundations tried to show that this inference is appealing due to its direct, inductive statements about hypotheses given data & its founding on Kolmogorov’s definition of conditional probability (which seems to me compelling and self-evident), & because it seems to permit non-reductive inference. Nonetheless, as a human formulation, subjective Bayesian inference is in need of reformation because, if nothing else, sin has impaired our ability to construct realistic priors. We have trouble distinguishing what we believe from what we hope to be true. Even our desire to reflect our true beliefs in priors is weakened; for we sometimes claim to believe something merely because we want others to believe it or because we want the conclusion of an analysis to benefit us. Plainly, then, Bayesianism cannot serve as a panacea for statistics in all respects. In any case, when statistical approaches other than subjective Bayesianism are proposed, I hope that the principles for statistics illustrated above can constitute a starting place for judging whether they are more suitable.

I would welcome your comments and questions on this post; if you would like to collaborate with me on refining it or on subsequent related research, please ask Josh Wilkerson (jwilkerson<at>godandmath<dot>com) to send me your contact information.

REFERENCE

Edwards, W., Lindman, H., & Savage, L. J. (1963). Bayesian statistical inference for psychological research. Psychological Review, 70(3), 193–242.

Why Math Works

by John D. Mays

Back in 1999 when I began teaching in a classical Christian school, one of the first books I heard about was James Nickel’s little jewel, Mathematics: Is God Silent? Must reading for every Christian math and science teacher, the book introduced me to a serious problem faced by unbelieving scientists and mathematicians. Stated succinctly, the problem is this: Mathematics, as a formal system, is an abstraction that resides in human minds. Outside our minds is the world out there, the objectively real world of planets, forests, diamonds, tomatoes and llamas. The world out there possesses such a deeply structured order that it can be modeled mathematically. So how is it that an abstract system of thought that resides in our minds can be used so successfully to model the behaviors of complex physical systems that reside outside of our minds?

For over a decade now this problem, and the answer to it provided by Christian theology, has been the subject of my lesson on the first day of school in my Advanced Precalculus class. But before jumping to the resolution of the problem, we need to examine this mystery – which is actually three-fold – more closely.

In his book Nickel quotes several prominent scientists and mathematicians on this issue. In 1960, Eugene Wigner, winner of the 1963 Nobel Prize for Physics, wrote an essay entitled, “The Unreasonable Effectiveness of Mathematics in the Natural Sciences.” Wigner wrote:

The enormous usefulness of mathematics in the natural sciences is something bordering on the mysterious and…there is no rational explanation for it…It is not at all natural that ‘laws of nature’ exist, much less that man is able to discern them…It is difficult to avoid the impression that a miracle confronts us here…The miracle of appropriateness of the language of mathematics for the formulation of the laws of physics is a wonderful gift which we neither understand nor deserve.

Next Nickel quotes Albert Einstein on this subject. Einstein commented:

You find it surprising that I think of the comprehensibility of the world…as a miracle or an eternal mystery. But surely, a priori, one should expect the world to be chaotic, not to be grasped by thought in any way. One might (indeed one should) expect that the world evidence itself as lawful only so far as we grasp it in an orderly fashion. This would be a sort of order like the alphabetical order of words of a language. On the other hand, the kind of order created, for example, by Newton’s gravitational theory is of a very different character. Even if the axioms of the theory are posited by man, the success of such a procedure supposes in the objective world a high degree of order which we are in no way entitled to expect a priori.

One more key figure Nickel quotes is mathematician and author Morris Kline:

Finally, a study of mathematics and its contributions to the sciences exposes a deep question. Mathematics is man-made. The concepts, the broad ideas, the logical standards and methods of reasoning, and the ideals which have been steadfastly pursued for over two thousand years were fashioned by human beings. Yet with this product of his fallible mind man has surveyed spaces too vast for his imagination to encompass; he has predicted and shown how to control radio waves which none of our senses can perceive; and he has discovered particles too small to be seen with the most powerful microscope. Cold symbols and formulas completely at the disposition of man have enabled him to secure a portentous grip on the universe. Some explanation of this marvelous power is called for.

The first aspect of the problem these scientists are getting at is the fascinating fact that the natural world possesses a deep structure or order. And not just any order, mathematical order. It is sometimes difficult for people who have not considered this before to get why this is so bizarre. Simply put, the order we see in the cosmos is not what one would expect from a universe that started with a random colossal explosion blowing matter and energy everywhere.

Many commentators have written about this and professed bafflement over it. All of the above quotes from Nickel’s book and many, many more are included in Morris Kline’s important work, Mathematics: The Loss of Certainty, which explores this issue at length. In his book The Mind of God, Paul Davies, an avowed agnostic, prolific popular writer and physics professor, takes this issue as his starting point. Davies finds the order in the universe to be incontrovertible evidence that there is more “out there” than the mere physical world. There is some kind of transcendent reality that has imbued the Creation with its mathematical properties.

The second aspect to the problem or mystery we are exploring is that human beings just happen to have serious powers of mathematical thought. Now, although everyone is happy about this, I rarely find anyone who is shocked by it. Christians hold that we are made in the image of God, which explains our unique abilities such as the use of language, the production of art, the expression of love, self-awareness, and, of course, our ability to think in mathematical terms. Non-Christians don’t accept the doctrine of the imago Dei, but seem to think that our abilities can all be explained by the theory of natural selection.

But hold on here one minute. Doesn’t it seem strange that our colossal powers of mathematical imagination would have evolved by means of a mechanism that presumably helped us survive in a pre-industrial, pre-civilized environment? Our abilities seem to go orders of magnitude beyond what evolution would have granted us for survival.

I know all about the God-of-the-gaps argument, and I’m not going to fall for it here. It may be that some day the theory of common descent by natural selection will be able to explain how we became so smart. That’s fine, and I’m not threatened by it. All I’m saying is that for now Darwinism still has a lot of explaining to do. And getting back to the concerns in this essay, I for one do not take Man’s amazing intellectual powers for granted. They are wonderful.

The third aspect to our problem is the most provocative of all. Mathematics is a system of symbols and logic that exists inside of our heads, in our minds. But the physical world, with all of its order and structure, is an objective reality that is not inside our heads. So how is it that mathematical structures and equations that we dream up in our heads can correspond so closely to the law-like behavior of the independent physical world? There is simply no reason for there to be any correspondence at all. It’s no good saying, “Well, we all evolved together, so that’s why our thoughts match the behavior of reality.” That doesn’t explain anything. Humans are a species confined since Creation to this planet. Why should we be able to determine the orbital rules for planets, the chemical composition of the sun, and the speed of light? I am not the only one amazed by this correspondence. All those Nobel Prize winners are amazed by it too, and they are a lot smarter than I am. This is a conundrum that cannot be dismissed. John Polkinghorne said it well in his Science and Creation: The Search for Understanding:

“We are so familiar with the fact that we can understand the world that most of the time we take it for granted. It is what makes science possible. Yet it could have been otherwise. The universe might have been a disorderly chaos rather than an orderly cosmos. Or it might have had a rationality which was inaccessible to us…There is a congruence between our minds and the universe, between the rationality experienced within and the rationality observed without. This extends not only to the mathematical articulation of fundamental theory but also to all those tacit acts of judgement, exercised with intuitive skill, which are equally indispensable to the scientific endeavor.” (Quoted in Alister McGrath, The Science of God.)

Which brings us to the striking explanatory power of Christian theology for addressing this mystery. As long as we ponder only two entities, nature and human beings, there is no resolution to the puzzle. But when we bring in a third entity, The Creator, the God who made all things, the mystery is readily explained. As the figure here indicates, God, the Creation, and Man form a triangle of interaction, each interacting in key ways with the other. God gives (present tense verb intentional) the Creation the beautiful, orderly character that lends itself so readily to mathematical description. And we should not fail to note here that the Creation responds, as Psalm 19 proclaims: “The heavens declare the glory of God.” (I have long thought that when the Pharisees told Jesus to silence his disciples at the entry to Jerusalem, and Jesus replied that if they were silent the very stones would cry out, he wasn’t speaking hyperbolically. Those stones might have cried out. They were perfectly capable of doing so had they been authorized to. But I digress.)

[Figure: “Why Math Works” – a triangle of interaction between God, Creation, and Man]

Similarly, God made Man in His own image so that we have the curiosity and imagination to explore and describe the world He made. We respond by exercising the stewardship over nature God charged us with, as well as by fulfilling the cultural mandate to develop human society to the uttermost, which includes art, literature, history, music, law, mathematics, science, and every other worthy endeavor.

Finally, there is the pair of interactions that gave rise to the initial question of why math works: Nature with its properties and human beings with our mathematical imaginations. There is a perfect match here. The universe does not possess an order that is inaccessible to us, as Polkinghorne suggests it might have had. It has the kind of order that we can discover, comprehend and describe. What can we call this but a magnificent gift that defies description?

We should desire that our students would all know about this great correspondence God has set in place, and that considering it would help them grow in their faith and in their ability to defend it. Every student should be acquainted with the Christian account of why math works. I recommend that every Math Department review its curriculum and augment it where necessary to ensure that students know this story.

John D. Mays is the founder of Novare Science and Math in Austin, Texas. He also serves as Director of the Laser Optics Lab at Regents School of Austin. John entered the field of education in 1985, teaching Math in the public school system. Since then he has also taught Science and Math professionally in Episcopal schools and classical-model Christian high schools. He taught Math and 20th Century American Literature part-time at St. Edward’s University for 10 years. He taught full-time at Regents School of Austin from 1999 to 2012, serving as Math-Science Department Chair for eight years. He continues to teach on a part-time basis at Regents, serving as the Director of the Laser Optics Lab. He is the author of many science textbooks, which I invite you to explore further on the Novare website.