Knowledge proxy: someone whose knowledge and judgment you substitute for your own.

Intellectual chain of custody (ICoC): a process that tracks the movement of ideas and sources from one person to another, back to their provenance as much as possible (this concept is alive in law and supply chain management, and I’m taking it for ideas now; I might also use “intellectual provenance,” and borrow from the art world).

We all need knowledge proxies, because none of us has the capacity to apprehend everything. We will need to select some people and rely on their judgments of fields and events beyond our scope of expertise and/or time.

We should not feel bad about not knowing everything, but we should be alarmed if we are not explicitly selecting our knowledge proxies.

Explicit selection means:

  • Paying active attention to the attitudes and track records of those you choose to trust, including who they choose as proxies themselves—this means tracking your ideas’ intellectual chain of custody.
  • Choosing to trust specific individuals deliberately, not just because you recognize their name or their position.
  • Being honest about what you know and what you rely on others to know. It is an intellectual posture of paying careful attention to what you know, and what you don’t. It is being responsible for your own opinions. It is agentic.
  • Solving, to some extent, the “how to evaluate experts in a field where you are not an expert”1 problem. Here I will quote the physicist Richard Feynman at length, because this is how I evaluate experts outside of my field: “The first principle is that you must not fool yourself—and you are the easiest person to fool. So you have to be very careful about that. After you’ve not fooled yourself, it’s easy not to fool other scientists. You just have to be honest in a conventional way after that.

    I would like to add something that’s not essential to the science, but something I kind of believe, which is that you should not fool the layman when you’re talking as a scientist. I’m not trying to tell you what to do about cheating on your wife, or fooling your girlfriend, or something like that, when you’re not trying to be a scientist, but just trying to be an ordinary human being. We’ll leave those problems up to you and your rabbi. I’m talking about a specific, extra type of integrity that is not lying, but bending over backwards to show how you’re maybe wrong, that you ought to do when acting as a scientist. And this is our responsibility as scientists, certainly to other scientists, and I think to laymen.”2

Implicit selection, which happens by default, means:

  • You don’t keep track of where you hear things, and by extension you cannot evaluate the validity or reliability of the knowledge that’s based on those things. You “read a study,” you “heard it somewhere,” you “saw a post about it.” You have no intellectual chain of custody, so there’s no telling what made-up, hyperbolic, or deliberately wrong things got slipped into your mind. You are more susceptible to mimesis.
  • You tend to overestimate your own levels of knowledge, because you have no specific list that limits your knowledge to yourself and various proxies. You just have yourself and a vague cloud. This lack of concreteness has horrible consequences.3
  • You will act as a vector for bad information, because you have kneecapped your own ability to validate what you think to be true. You can’t audit your intellect.

When thinking about knowledge proxies, ask yourself these two questions:

  1. Who are my proxies, and for what?
  2. For whom am I a proxy?

That second question is a doozy. It really makes you step back and think “What do I actually know, and what do other people rely on me to know, and in what capacity?” It forces you, more than the first question, to confront the responsibility we all have to be honest, and to think well. Whether or not you confront this reality directly, people rely on you to know (even if this takes the form of you implicitly endorsing them, by not challenging them, when they confabulate).

My own answer to the second question, in part, is: I am a political knowledge proxy for many of my friends, Twitter acquaintances, and the students in The Foundations of New York. Although I have political areas that I know more about than others, I’m primarily a knowledge switchboard and guide. I point people in the right direction when they need political knowledge. I have many proxies in the political realm myself, and they appear voluminously in my footnotes (you can check these for examples).

If you don’t have the time to learn anything about politics at the moment, I will be your political proxy; by extension, I offer my own stable of political proxies, who have been vetted according to my own judgment. I’m not infallible, but I think I get most things right, and I can error-correct very well. This is no more than I expect of my own proxies.

The image used in this post’s social media thumbnail is Raphael’s School of Athens.


  1. You can verify common factual claims they make that are easy to check, you can see how they react to being challenged, you can look at past work and see how it holds up (and whether they revisited it if it’s wrong), etc. There’s a whole meta-skillset to evaluating experts outside of your own field. 

  2. (emphasis added) This is from Feynman’s 1974 Caltech commencement address, which also originated the modern popular usage of the term “cargo cult.” 

  3. See the second half of my essay The Anti-Concreteness Meme.