You are reclining comfortably outdoors late on a sunny afternoon. Your eyes roll lazily over the calm sky until they come to rest upon a big, fluffy white cloud. The fading sunlight falls across it so as to emphasise its sharp edges as it floats dreamily across the soft blue backdrop.

The cloud is so captivating that you charter a plane in order to take a closer look at it before the sun sets. You rush headlong into the sky. However, as you reach your objective its marvellous clarity disappears, and so does your enchantment! Eventually, all you perceive is a fog, and you have no clear idea of where it begins and ends. You feel cheated, though perhaps you should have known better. You remember what they told you at school: clouds are simply collections of airborne water droplets that are sufficiently close to one another. So it’s no wonder that on closer inspection the cloud lost its prior appearance of possessing sharp boundaries.

The plane lands. As you step onto firm ground, a puzzling thought occurs to
you. If the cloud is just a collection, or aggregate, of water droplets
configured in a certain way, then *which* aggregate is it? It seems that
our concept of cloudhood just isn’t fine-grained enough to isolate one
collection as *the* cloud. At the margins of the cloud, the
concentration of water droplets decreases gradually. Hence, there are many ways
of drawing the boundaries of the cloud which seem equally good. And corresponding
to each of these ways of bounding the cloud is a collection of water droplets.
Thus, there seem to be many collections of water droplets that satisfy the
conditions for cloudhood. But if this is so, then it would appear that where
you originally thought there was one cloud, there are actually a multitude of
clouds.

With your feet planted firmly on the ground but your head still somewhere in
the sky, you place a hand in your pocket. Your fingers close around a cool
metal object—a dollar coin, in fact. Suddenly, you are drawn back to reality.
You realise that you have no means of paying for your flight. In your haste to
charter the plane before sundown you forgot to take any means of payment with
you. Then it occurs to you that dollar coins are a lot like clouds. You thought
you only had one of them in your pocket, but you must have many *thousands*,
at least! The atoms that make up the coin in your pocket are continually
exchanging particles with atoms outside the coin. So, we may ask, which
aggregate of particles is the dollar coin? At the margins of the coin there are
many particles for which it is a matter of indifference whether they are
considered part of the coin or part of the coin’s surroundings. Therefore,
there are many equally adequate ways of marking the boundary of the coin. And
therefore, there are many different aggregates of particles that make equally
good coins. It would seem that you have thousands of dollar coins in your
pocket. Yet these thoughts offer you little comfort as the pilot smiles and
politely enquires about your fare.

Peter Unger has termed the puzzle we have just been considering *the
Problem of the Many*.[1]
Where we think there is only one thing of a certain sort, there turn out to be
many such things. Or, perhaps, there are none. Unger himself suggests that a
better label might have been the Problem of the Many *or the None*; if
you find it too hard to accept that there are always many clouds in situations
where we would normally say that there is only one cloud, you should conclude
with Unger that there are no clouds whatsoever. You must make your choice
between these alternatives, Unger thinks, but what you can’t do is say that
there is just one cloud in these situations.[2]
Moreover, if the Problem of the Many is a problem in the case of clouds, then
it is, as we have just seen, a problem in the case of coins. But there is
nothing special about coins. For, just as coins are collections of particles,
so are all other everyday physical objects. Thus, the Problem of the Many turns
out to be quite general.

I take it that most of us would not favour Unger’s conclusion that there are no everyday physical things. Must we, then, say that there are many things where we thought there was just one? Or can we avoid Unger’s dilemma and retain our everyday views about everyday things? The main purpose of this paper is to decide whether supervaluations can help us avoid the dilemma. I will argue that they cannot. I will also suggest that my arguments against the use of supervaluations count against the solution to the Problem of the Many offered by those who endorse an epistemic theory of vagueness.

None of the other solutions currently offered in the literature are unproblematic. Or so it seems to me. Perhaps we need something new. In any case, as I will now explain, supervaluations don’t help.

Supervaluations have been in favour for some time as a means of dealing with
vague predicates. Vague predicates are predicates that have (or more
accurately, *can* have) borderline cases of application. For example,
the predicate ‘is inflated’ is vague. If the air pressure inside a soccer ball
falls within a certain range, then it counts as being inflated. If the air
pressure inside the soccer ball falls within a certain lower range, then it
counts as being uninflated. These two ranges are not contiguous; there is a gap
between them. If the ball’s internal air pressure falls within this gap, then
the ball counts as a borderline case of the predicate ‘is inflated’: the ball
is neither inflated nor uninflated, but is somewhere in between.

Those who find supervaluational treatments of vagueness appealing consider
vagueness to be a thoroughly semantic phenomenon; that is, they think that
there are vague predicates and concepts, but no genuinely vague properties or
objects. David Lewis, who is a representative of this view, speaks of vague
predicates as involving *semantic indecision*. A vague predicate is
imperfectly decisive; it does not exhaustively divide the world into things
that satisfy it and things that fail to satisfy it. For each thing that is left
over, it is undecided whether or not the predicate applies to that thing. The
attractiveness of supervaluational treatments of vague predicates is seen to
lie in their conservatism: it is argued that supervaluationism accommodates the
borderline cases that vague predicates engender, while avoiding the weakening
of classical laws such as Excluded Middle and Non-Contradiction.

It will be useful here to briefly describe how supervaluational treatments
of vague predicates work. Many soccer balls are uninflated (e.g. those that lie
in warehouses, waiting to be sent to retail outlets). And, obviously, many
soccer balls are inflated—such inflation is in most cases a precondition of a
worthwhile game of soccer. However, some soccer balls occupy the ‘grey area’.
They are neither inflated nor uninflated. A supervaluational treatment of ‘is
inflated’ involves first noticing that there are a number of permissible ways
of extending our notions of inflation and uninflation so that each of the
intermediate cases count as either inflated or uninflated. Each permissible way
draws a line somewhere along the range of indeterminate cases and assigns
inflation to those on the inflated side of the line and uninflation to those on
the uninflated side of the line. These extensions are often called *sharpenings*
or *precisifications* of the original notions. Next, we say it is
super-true that *x* is inflated iff it is true on all permissible
sharpenings that *x* is inflated, it is super-false that *x* is
inflated iff it is false on all permissible sharpenings that *x* is
inflated, and it is super-indeterminate that *x* is inflated iff it is
true on only some permissible sharpenings that *x* is inflated.
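Schematically, and in notation of my own (letting $S$ range over the permissible sharpenings), the three clauses just given are:

```latex
\begin{align*}
\text{super-true}\,[\mathrm{Inflated}(x)]
  &\iff \mathrm{Inflated}(x)\ \text{is true on every permissible }S\\
\text{super-false}\,[\mathrm{Inflated}(x)]
  &\iff \mathrm{Inflated}(x)\ \text{is false on every permissible }S\\
\text{super-indeterminate}\,[\mathrm{Inflated}(x)]
  &\iff \mathrm{Inflated}(x)\ \text{is true on some, but not all, permissible }S
\end{align*}
```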

Indeterminacy is retained in our original imprecise language since there are
soccer balls that are inflated on some sharpenings only. Furthermore, the laws
of Excluded Middle and Non-Contradiction are also preserved: every sharpening
makes [Inflated]*x* ∨ ¬[Inflated]*x* true, though not in a uniform way.
The same applies for ¬( [Inflated]*x* ∧ ¬[Inflated]*x* ).
Sorites problems are addressed by claiming that there is indeed a most inflated
uninflated ball, since there is such a ball on every permissible sharpening.
This, it is said, is not to deny the very vagueness of ‘is inflated’ and ‘is
uninflated’. Such a denial, it is held, requires there to be some ball for
which it is the case that *it* is the most inflated uninflated ball. But
there is no such ball, since no particular ball is the most inflated uninflated
ball on more than one permissible sharpening. Finally, as Lewis puts it,
‘Super-truth, with respect to a language interpreted in an imperfectly decisive
way, replaces truth *simpliciter* as the goal of a cooperative speaker
attempting to impart information.’[3]
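The machinery just described can be mimicked in a small computational sketch (entirely my own illustration; the cut-off pressures are invented for the example, and real sharpenings need not be simple thresholds):

```python
# Model each permissible sharpening of 'is inflated' as a cut-off pressure:
# on that sharpening, a ball counts as inflated iff its internal pressure
# meets or exceeds the cut-off. (Hypothetical values, arbitrary units.)
sharpenings = [0.6, 0.7, 0.8]

def supervaluate(pressure):
    """Classify 'this ball is inflated' by quantifying over sharpenings."""
    verdicts = [pressure >= cut for cut in sharpenings]
    if all(verdicts):
        return "super-true"
    if not any(verdicts):
        return "super-false"
    return "indeterminate"   # true on some sharpenings only

print(supervaluate(0.9))   # super-true
print(supervaluate(0.3))   # super-false
print(supervaluate(0.65))  # indeterminate

# Excluded Middle holds on every sharpening, though not uniformly:
# 'inflated or not inflated' comes out true at each cut-off.
assert all((p >= cut) or not (p >= cut)
           for p in (0.3, 0.65, 0.9) for cut in sharpenings)
```

The assertion at the end illustrates why Excluded Middle survives: it is verified sharpening by sharpening, even though different sharpenings classify the borderline ball differently.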

Returning to Unger’s dilemma, let us first look in a little more detail at how it arises. The supervaluational solution will then be outlined. Consider these two principles:

(1) Coins are distinguished from non-coins by their physical characteristics (e.g. shape, size and design).[4]

(2) If two things significantly overlap then they are not both coins.[5]

If we accept (1) but not (2), we get the conclusion that there are many
coins in your pocket. And (1) appears highly plausible. But there is pressure
to accept (2) as well. Acceptance of some such principle seems to be the only
way to ensure that there is *at most* one coin in your pocket. But we
can’t accept both (1) and (2). (1) entails that all of the suitable candidates
for coinhood in your pocket are coins, since they all have the requisite
physical properties, while (2) entails that at most one of the suitable
candidates is a coin. If our concept of coinhood is governed by both (1) and
(2) then our concept of coinhood is incoherent and there are no coins.
Acceptance of (2) alone doesn’t seem to be an option either, since (2) by
itself entails nothing about how many coins there are in your pocket.

The supervaluational solution diagnoses the Problem of the Many as a problem of semantic indecision. We have never needed a concept of coinhood precise enough to distinguish between the various contenders in your pocket. So the semantic decision remains unmade. The predicate ‘is a coin’, for instance, has many clearly negative instances, such as those things which are not coin-shaped. And, according to the supervaluationist, it has no clear positive instances; your pocket contains lots of borderline cases of coins.

However, the supervaluationist promises to give us everything we want. The
supervaluational treatment licenses us to say that there is just one coin in
your pocket. And it also licenses our assent to (1) and (2). For any borderline
case in your pocket, *x*, there is a permissible way of sharpening ‘is a
coin’ so that it applies to *x* but not to any of the other borderline
cases in your pocket. Now, since it is true according to every permissible
sharpening that there is just one coin in your pocket, it is super-true that
there is just one coin in your pocket. Moreover, (1) is true on every
sharpening of ‘is a coin’, as is (2). So both (1) and (2) are super-true.
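The structure of this reply can be made concrete with a toy model (my own sketch; the candidate names are invented, and each sharpening is represented simply by the set of candidates it counts as coins):

```python
# Four overlapping aggregates of particles, all borderline cases of 'is a coin'.
candidates = ["agg1", "agg2", "agg3", "agg4"]   # hypothetical labels

# One permissible sharpening per candidate: each sharpening applies the
# sharpened predicate to exactly one candidate in the pocket.
sharpenings = [{c} for c in candidates]

# 'There is just one coin in your pocket' is true on every sharpening,
# and hence super-true.
assert all(len(s) == 1 for s in sharpenings)

# Yet for each candidate, 'x is a coin' is true on some sharpenings and
# false on others, and hence indeterminate.
for c in candidates:
    assert any(c in s for s in sharpenings)
    assert not all(c in s for s in sharpenings)
```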

Before discussing the merits of the supervaluational approach, I want to
address some concerns that might arise regarding my exposition of the Problem.
Some people might baulk at my speaking of *the predicate* ‘is a coin’.
After all, isn’t the Problem of the Many usually discussed in terms of names,
like ‘Tibbles’, or descriptions, such as ‘the coin in my pocket’? It might be
wondered whether it is a mistake to treat the problem as one involving predication.

In response, note that even if it were true that the Problem of the Many is not explicitly made out in terms of predication, there are entailments including the predicate ‘is a coin’ that ought to hold. Thus, for instance, ‘The coin in my pocket is round’ entails ‘There is one thing in my pocket which is a coin and is round’.[6] However, I suspect it is going to be difficult to even state the Problem without predicates like ‘is a coin’ or their surrogates. Consider these examples from the literature, where the Problem is being expressed in terms of clouds:

...it seems clear that no matter which relevant concrete complex is deemed fit for cloudhood, that is, is deemed a cloud, there will be very many others each of which has, in any relevant respect, a claim that is just as good.[7]

It is, therefore, entirely arbitrary to pick on one particular aggregate and insist that that is the cloud. But if all of them count as clouds then there is not one cloud in the sky but many, contrary to our initial supposition.[8]

Since they have equal claim, how can we say that the cloud is one of these aggregates rather than another? But if all of them count as clouds, then we have many clouds rather than one.[9]

Note the expression ‘count as clouds’ that appears in the extracts from Lewis and Tye. Here, we might legitimately substitute ‘exemplifies the concept of cloudhood’, or its linguistic correlate, ‘satisfies the predicate “is a cloud” ’. And in the extract from Unger it is clear that the problem is being set up in terms of cloudhood being predicated of concrete complexes. The reason the Problem seems to have bite is that there are many candidates which appear to satisfy the conditions for cloudhood, or, in other words, appear to satisfy the predicate ‘is a cloud’.

Let us turn, then, to the status of supervaluationism as a solution to the Problem of the Many. I will argue that supervaluationism, considered as a solution to this problem, is exposed to serious objections.

**3 A Problem for Supervaluationism**

**3.1 Principled Sharpenings**

The night after your plane trip to the sky you sleep restlessly. You experience surreal visions of coins all through the night—coins glinting in the sun; coins spinning through the air; a sea of coins in a miser’s hoard. Upon waking you find yourself to be feeling no less impulsive than you were the previous day. You decide to indulge your new fixation and visit a coin exhibition. You arrive. ‘2547 Coins from Every Time and Place that Matters’, the sign blares. Being in a suspicious frame of mind you count them all, only to discover that the sign was right.

When we think about the Problem of the Many, it is easy to forget that a permissible sharpening has to do more than just cater for the truth of statements like, ‘There is one coin here’. It is easy to forget this because the Problem of the Many is presented in the literature only in terms of single cases—single clouds, single coins and the like. Yet, the role of a permissible sharpening is much more exacting than this. Suppose that as you make your way through the coins on display, your eye is particularly caught by a pretty, gleaming shilling. A permissible sharpening of ‘is a coin’ must be sure to say that there is just one shilling that has particularly caught your eye. But there is also a much broader truth that any permissible sharpening must allow for, namely, that there are 2547 coins in the exhibition.

Imagine that *just one* of the supposedly permissible sharpenings
yielded a different figure for the number of coins on display. It would turn
out that the number of coins in the exhibition is not determinate. This follows
from the fact that any disagreement between permissible sharpenings over the
question of how many other coins are in the exhibition translates into
indeterminacy when we supervaluate. Indeterminacy of this kind would be most
unwelcome here. And it is not just indeterminacy that the supervaluationist
needs to avoid. If every permissible sharpening agrees that there is some
number of coins in the exhibition that differs from the number you counted, we
should be equally dissatisfied.

We may summarise the situation for the supervaluationist as follows. Corresponding to each of what we would ordinarily describe as the 2547 coins in the exhibition, there is a cluster of aggregates of particles. To make the situation a little more concrete, let’s imagine that just after you have counted the coins, you exclaim, ‘There really are 2547 coins in the exhibition after all!’. Furthermore, let us suppose that at a certain instant while you are making this exclamation, each coin in the exhibition is in the process of donating four hundred electrons to the surface on which it is mounted. Moreover, in each case, most of the four hundred electrons are neither definitely parts of the coin nor definitely parts of its mounting. Notice that for each coin, there are many more coin-candidates than doubtful coin-parts: some coin-candidates include only one of the doubtful electrons, some include two, others include three, and so on.

Now, each permissible sharpening must select as a coin exactly one candidate
from each cluster of coin-candidates. I intend to argue that this can only be
done if we allow that those sharpenings which select one candidate from each
cluster do so in an arbitrary fashion. I will then, in Section *3.2*,
suggest why such arbitrary sharpenings of ‘is a coin’ are unsatisfactory. As a
background assumption, we will pretend that classical physics accurately
describes the physical world; hence, we will assume that every physical thing stands
only in determinate spatiotemporal relationships with other physical things.
If, as I suspect, it turns out that ‘is a coin’ cannot be sharpened in a
principled way even under such generous conditions, then that is quite a
significant result.

The way that our ordinary predicate, ‘is a coin’, selects objects as coins
is principled. As I noted in Section *2*, coins are distinguished from
non-coins on the basis of certain common features, for instance, their
perceptible shapes, sizes and designs. Of course, the selection principles that
govern the application of our ordinary predicate ‘is a coin’ are not
fine-grained enough to distinguish between the many aggregates of particles
which are vying for the mantle of ‘the shilling that particularly caught your eye’—hence,
the Problem of the Many. The question I now want to address is whether there
could be adequate sharpenings of ‘is a coin’ that are principled. That is,
whether there could be adequate sharpenings according to which those aggregates
of particles that satisfy the (sharpened) predicate do so in virtue of some
common feature or features.

Think again of the shilling that has caught your eye. Consider a certain
sharpening, *Q*, of ‘is a coin’ that selects just one of the candidate
aggregates, *s*, as a shilling. And assume that this selection is not
arbitrary; there is some feature that *s* has and the other candidates
lack, which forms the basis for *s*’s selection. As I will now explain,
very exacting demands are placed on Sharpening *Q*.

Next to the shilling is a penny. *Q* selects *s* as a coin, while
rejecting many aggregates from the same cluster that differ from *s* by
only a particle or two. At the same time it also selects one candidate, *p*,
from the adjacent penny-cluster as a coin. Note that shillings and pennies are
quite different from each other. To the naked eye, their diameters vary
markedly and the designs on their faces are altogether different. Now, consider
a candidate to be the shilling, *s _{1}*, which overlaps *s* almost
completely, differing from it by no more than a particle or two.

The almost exact similarity of *s* and *s _{1}*, and the relative
dissimilarity between *s* and *p*, place severe demands on Sharpening
*Q*: it must identify some feature that *s* shares with *p* but
that *s _{1}* lacks.

*s* is much, much more similar to *s _{1}* than to *p*; yet
*Q* groups *s* with *p* while excluding *s _{1}*. It is hard
to see what feature could underwrite such a classification in a principled way.

That the principle underlying Sharpening *Q* might not be immediately
obvious, does not yet establish that there is no principle. Let us now see what
the supervaluationist might be able to offer.

Since in our regular usage of ‘is a coin’ we distinguish coins from
non-coins in part on the basis of their shape-properties, we might consider
whether some sharpened notion of shape could underwrite *Q*. This
suggestion is quite obvious, but quite obviously has no chance of working.
Sharpening on the basis of shape would lead either to (a) not sharpening ‘is a
coin’ enough to eliminate all but one coin-candidate in each cluster, or (b)
making ‘is a coin’ so precise that one candidate from a certain cluster is
selected over candidates in the same cluster that differ from it by only a
particle or two, with the result that nothing else anywhere counts as a coin.

A more sensible thought might be that ‘is a coin’ could be sharpened on the
basis of density. There are a couple of things we might have in mind here. For
instance, Sharpening *Q* might pick out a certain density, and only the
coin-candidate from each cluster matching that density counts as a coin. Or, to
accommodate the fact that not all coins are, or need be, made out of materials
with similar densities (a coin might easily be made out of plastic, for
instance) you could complicate things. Instead of simple densities, you might
say that *Q* involves *ratios* of densities. *Q* says that a
candidate is a coin only if the ratio of the density deep in its interior to
the density at its periphery has a certain value. On the other hand, you might
try something different. For instance, you might try something based not on
density, but on the simpler property of distance: according to *Q*, *x*
is a coin iff it is a coin-candidate, and for every sub-atomic particle, *y*
that is a part of *x*, there is another sub-atomic particle that is a part
of *x* and is at a distance less than *d* from *y*.

There are grave problems for each of these approaches. Consider first the
density and ratio-based approaches. The most telling difficulty is that if the
density-value or ratio-value is not very sharply defined, then *Q* could
easily admit too many candidates as coins. On the other hand, if these values
are quite sharply defined, then *Q* might not say that there are enough
coins; some clusters might not include any candidates that match the value. The
distance-based approach also faces a difficulty of specification. For sensible
values of *d*, it does ensure that there is at least one coin per cluster,
but it doesn’t eliminate the possibility that there is more than one coin per
cluster. Moreover, it seems highly plausible that the problems of specification
besetting the approaches I have outlined here will generalise for other
quantities we might try to substitute for the ones mentioned here.

Perhaps, however, there is a means of circumventing the problems associated
with the approaches I have just considered. The thought is that we can make a
simple amendment to some of those approaches. We simply say, for example, that
*x* is a coin according to Sharpening *Q* iff *x* has the *greatest*
density of all the coin-candidates in its coin-cluster. For another sharpening,
it might be the third-greatest density that is relevant, or it might be the
smallest amount of some other quantity. It should be fairly clear that this
amendment is not going to work in the case of density. Since density is a ratio
of mass to volume, it follows that appeal to the greatest density (or the
second-greatest, or the smallest, etc.) is not going to remove the possibility
of ties. For instance, it would be quite possible for two coin-candidates in
the same cluster to share the greatest density in their cluster. What the
supervaluationist needs, then, is a simpler quantity—one that is not a ratio.
As good a quantity as any to try here is mass. But even a simple quantity like
mass provides us with only two feasible ways of sharpening ‘is a coin’, and as
I will now explain, this is not satisfactory.

Consider a certain coin-cluster. There is bound to be a coin-candidate with the smallest mass in this cluster. That aggregate is the sum of the particles definitely included in the coin. Likewise, there is bound to be a coin-candidate with the largest mass, namely, the sum of the particles definitely included in the coin and those particles which are neither definitely included nor excluded.

However, for any *n* strictly between 1 and the rank of the
smallest mass, there will be no unique coin-candidate with the *n*th-greatest
mass. To illustrate this, consider whether there could be a unique
coin-candidate which has the second-greatest mass. To find the one that has
the second-greatest mass, we ‘take away’ from the coin-candidate with the
greatest mass one of the electrons that is neither a definite part, nor a
definite non-part, of the coin. But which one should we ‘take away’?
Corresponding to every electron that is a questionable part of the coin, there
is a coin-candidate that includes all of the other questionable electrons
except that one. And each of these coin-candidates is such that there is only
one coin-candidate that has a greater mass than it. We can see easily enough
that these considerations iterate. Suppose we wanted to find the coin-candidate
with the third-greatest mass. In that case, we would ‘take away’ two electrons.
But which two should we ‘take away’? (And so on.)
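The combinatorics behind this point can be checked directly (a toy calculation of my own; it assumes for simplicity that the questionable electrons all have equal mass, so a candidate's mass is fixed by how many of them it includes):

```python
from math import comb

# Toy model (my illustration, not from the paper): a coin-candidate is the
# definite core plus some subset of the k questionable electrons. On the
# equal-mass assumption, the number of candidates sharing a given mass is
# a binomial coefficient C(k, j), where j is how many electrons are included.
k = 400  # questionable electrons per coin, as in the text

def n_candidates(j):
    """Number of distinct candidates including exactly j of the k electrons."""
    return comb(k, j)

print(n_candidates(k))      # 1     -> unique candidate with the greatest mass
print(n_candidates(0))      # 1     -> unique candidate with the smallest mass
print(n_candidates(k - 1))  # 400   -> 400 candidates tie for second-greatest
print(n_candidates(k - 2))  # 79800 -> ties for third-greatest, and so on
```

Only the two extremes are uniquely realised; every intermediate mass is shared by many candidates, which is why only ‘greatest mass’ and ‘smallest mass’ yield well-defined sharpenings.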

It seems, then, that the supervaluationist is left with only two ways of sharpening ‘is a coin’: this predicate can be sharpened in terms of having the greatest mass or having the smallest mass in its cluster of coin-candidates. Unfortunately, this turns out to be quite problematic. Since the supervaluationist gives the semantics of our ordinary imprecise notions in terms of the ways that they could be made more precise, it turns out that, quite contrary to appearances, our ordinary notion of coinhood tells us that there are only two coin-candidates in each cluster. The Problem of the Many turns out in fact to be the Problem of the Two! This consequence diminishes the Problem of the Many in a way that, I believe, would be utterly surprising to everyone. Consider an analogous situation regarding the general supervaluationist treatment of vagueness. Suppose it turned out that there were only two permissible sharpenings of ‘short man’. According to one of these sharpenings only men shorter than, or equal to, 181 centimetres in height are short, and according to the other the crucial figure is 183 centimetres. This would be equally unsatisfying, since it would render certain intuitively borderline cases (e.g. men of 180 centimetres) as definite cases of shortness, and other intuitively borderline cases as definite cases of tallness (e.g. men of 184 centimetres). If the supervaluationist wishes to offer a plausible solution to the Problem of the Many, then many more than two permissible sharpenings of ‘is a coin’ are needed.

Taken together, the considerations I have raised make it difficult to believe
that *Q* could be thoroughly principled in a plausible way; at the very
least, the onus now rests on the supervaluationist to show how *Q* might
be appropriately principled. Moreover, this conclusion holds for every other
supposedly permissible sharpening of ‘is a coin’, since *Q* is not
different in any relevant respect from those other sharpenings.

What is the alternative to thoroughly principled sharpenings? The only
alternative, it would seem, is to allow for a degree of arbitrariness in *Q*.
Let me now explain how this might work. Consider one of the failed attempts to
secure entirely principled sharpenings for *Q*, say, the distance-based
account. This account said that according to *Q*, *x* is a coin iff
it is a coin-candidate, and no sub-atomic particle that is a part of *x*
is at a greater distance than *d* from every other sub-atomic particle that
is part of *x*. Suppose that this account does manage to reject many of
the coin-candidates in each cluster. We regard *Q* as a function which
rejects any coin-candidate, *x*, that has as a part a sub-atomic particle
which is at a distance greater than *d* from all other sub-atomic
particles that are a part of *x*. For any clusters where the
distance-based account fails to eliminate all but one coin-candidate, *Q*
arbitrarily selects one candidate from the remainder as a coin. Some other
sharpenings will also use *d* as the value for the ‘first step’ of the
elimination procedure, while arbitrarily selecting different candidates than *Q*.
And still further sharpenings will use a value different from *d* for the
‘first step’.
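The two-step procedure just described might be sketched as follows (my own illustration; `passes_distance_test` stands in for the principled first step, and the random tie-break models the arbitrary second step):

```python
import random

def sharpening_Q(clusters, passes_distance_test, rng):
    """One partially arbitrary sharpening of 'is a coin': a principled
    first step (the distance test) eliminates some candidates in each
    cluster, then one survivor per cluster is selected arbitrarily."""
    coins = []
    for cluster in clusters:
        survivors = [c for c in cluster if passes_distance_test(c)]
        # The arbitrary step: when more than one candidate survives the
        # principled test, Q simply picks one of them.
        coins.append(rng.choice(survivors))
    return coins

# Tiny worked example with made-up clusters of candidate labels.
clusters = [["s", "s1", "s2"], ["p"]]
pick = sharpening_Q(clusters, lambda c: c != "s2", random.Random(0))
assert len(pick) == 2 and pick[1] == "p" and pick[0] in ("s", "s1")
```

Different sharpenings correspond to different tie-breaks (and different values of the distance parameter), which is exactly where the arbitrariness enters.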

If we are liberal enough to count sharpenings that exhibit some arbitrariness as permissible, then it is easy enough to see that, on the surface, we will get the answers we want. Every permissible sharpening will be such that there is just one shilling that has particularly caught your attention. In addition, every permissible sharpening will be such that there are exactly 2546 other coins in the exhibition.

However, such a liberal approach has never, to my knowledge, been advocated
for other supervaluational treatments of vague natural language predicates. An
important reason why no one has advocated such an approach involves what Kit
Fine has called *penumbral connections*.[10]
In Fine’s terminology, the positive extension of a vague predicate makes up the
*umbra* of that predicate, while its borderline cases constitute the *penumbra*.
Now, the meanings of certain predicates, like ‘short man’, for instance, appear
to impose logical connections between certain sentences. Given that Wayne is
shorter than Phil, the following material conditional seems to be imposed: if
Phil is short then Wayne is short. This connection is absolutely
uncontroversial if Phil and Wayne are both in the umbra of ‘short man’; both
antecedent and consequent are true, and so the conditional is true. But what do
we say if both Phil and Wayne are in the penumbra of ‘short man’? In that case,
both the antecedent and consequent are indeterminate and so the conditional is
indeterminate. The supervaluationist thinks we can’t be satisfied with this
result. Provided that Wayne is shorter than Phil, the meaning of ‘short man’
dictates that ‘If Phil is short then Wayne is short’ is true, even if both of
the men are penumbral cases. Thus, the supervaluationist thinks, there are
penumbral as well as umbral logical connections. That is, there are sentences
which are indeterminate because they have penumbral subjects, but which
nevertheless stand in logical relationships with other sentences, including
other indefinite sentences.

The desire to include penumbral as well as umbral connections lies at the heart of supervaluational approaches to vagueness.[11] The supervaluationist retains penumbral connections by the now familiar device of supervaluations. We say, for instance, that ‘If Phil is short then Wayne is short’ is super-true, since it is true on all permissible sharpenings of ‘short man’.

With respect to the interests of this paper, the most important thing to note about penumbral connections is the following. In order for a sentence expressing a penumbral connection to count as super-true, those sharpenings that conflict with the penumbral connection must count as impermissible; penumbral connections impose conditions that a sharpening must satisfy if it is to count as permissible. Thus, for instance, no sharpening of ‘short man’ which attributes shortness to Phil but not to Wayne is to count as permissible, since such a sharpening falsifies ‘If Phil is short then Wayne is short’. As I will now indicate, this sort of example shows why it is not acceptable to count sharpenings of ‘short man’ that are tinged with arbitrariness as permissible.
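The constraint that penumbral connections place on permissibility can be stated as a simple monotonicity check (my own sketch, with invented heights):

```python
heights = {"Phil": 179, "Wayne": 178}   # Wayne is shorter than Phil

def permissible(short_set):
    """A sharpening (here, just the set of men it counts as short) honours
    the penumbral connections only if, whenever it counts a man as short,
    it also counts every shorter man as short."""
    return all(
        other in short_set
        for s in short_set
        for other in heights
        if heights[other] < heights[s]
    )

assert permissible({"Phil", "Wayne"})
assert permissible({"Wayne"})
assert permissible(set())
assert not permissible({"Phil"})   # short Phil, non-short Wayne: ruled out
```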

Let’s consider an example closely analogous to the situation where *Q*
is held to be a partially arbitrary sharpening of ‘is a coin’. Call *M* a
sharpening that shrinks the penumbra of ‘short man’ only so far in a principled
way. Ignoring complications arising from second-order vagueness, let’s
stipulate for the sake of argument that penumbral cases of ‘short man’ are
those men ranging in height from 175 to 182 centimetres. *M* shrinks that
gap in a principled way by counting men shorter than 177 centimetres as short,
and men taller than 180 centimetres as non-short. *M* then counts each of the
remaining men, arbitrarily, as either short or non-short. In particular, Wayne,
who is 178 centimetres in height, is counted as non-short, while Phil, who
measures 179 centimetres, is counted as short.

It is true that *M* does, upon supervaluation, give us the right
results for every individual man concerning the question of whether he is
short, non-short or neither. And it does accommodate some penumbral
connections—for example, [Short]Wayne ∨ ¬[Short]Wayne and
¬([Short]Wayne ∧ ¬[Short]Wayne). But it does not honour *all* penumbral connections. Obviously, it
does not honour ‘If Phil is short then Wayne is short’. There are other
partially arbitrary sharpenings that do honour this connection, but many of
these do not honour a similar connection between Phil and another man in the
penumbra, Ed. And of those which honour both of these connections, there are
many that do not honour a similar connection between Ed and another man in the
penumbra, James. And so on. By the end of this process of elimination we are
left with just those sharpenings that do not look arbitrary in the slightest.
Each of these sharpenings has a tallest (or equal-tallest) short man, and is
such that every man who is shorter than the tallest short man is short, and
every man taller than the tallest short man is non-short. Thus, it appears that the
only sharpenings of ‘is a short man’ that can honour all of the penumbral
connections are those that are thoroughly principled.
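Schematically, and on the simplifying assumption that shortness turns on height alone, each sharpening that survives this elimination is a threshold sharpening, fixed by a cut-off height $h_\sigma$ (the height of its tallest short man; the notation is mine):

```latex
% A thoroughly principled sharpening \sigma of `short man' classifies
% by a single cut-off height h_\sigma (illustrative notation):
\sigma \models \mathrm{Short}(x) \iff \mathrm{height}(x) \le h_\sigma
```

Any sharpening not of this form counts some man as short while counting a shorter man as non-short, and thereby falsifies a penumbral conditional of the ‘If Phil is short then Wayne is short’ variety.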

Now we are ready to return to the Problem of the Many. Earlier, I argued that a supervaluational solution to the Problem of the Many requires each sharpening of ‘is a coin’ to select, in a partially arbitrary way, one aggregate from each cluster of coin-candidates as a coin. I will now urge that a policy of allowing such sharpenings for this predicate falls foul of penumbral connections.

The argument is quite simple. In our ordinary discourse, coins are differentiated from non-coins in a principled way. We would never say, ‘In my hand is a coin and on my wrist is a watch, but there is no principled reason why one is a coin and the other is not’. The meaning of ‘is a coin’ dictates the following maxim:

*Non-Arbitrary Differences* (NAD):

For any coin and non-coin, there is a principled difference between them which forms the basis for one being a coin and the other being a non-coin.

*NAD* imposes penumbral connections that any permissible sharpening of
‘is a coin’ must satisfy. Imagine two aggregates of particles, *d* and *e*,
which both count as borderline coins. *NAD* imposes the following
penumbral connections on every permissible sharpening: if *d* is a coin
then so is *e* unless it differs from *d* in a principled way. And
likewise, if *e* is a coin then so is *d* unless it differs from *e*
in a principled way. A supervaluational solution to the Problem of the Many
that relies upon arbitrary sharpenings is bound to violate these connections.
And this means that *NAD* turns out to be super-false, which is thoroughly
unsatisfactory. What this suggests is that a supervaluational solution must
employ only entirely principled sharpenings. But as I argued in Section *3.1*,
this option is not available.
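The two penumbral connections that *NAD* imposes on the borderline candidates *d* and *e* can be displayed as constraints on each permissible sharpening $\sigma$, writing $\mathrm{PD}(d,e)$ for ‘there is a principled difference between *d* and *e*’ (the notation is mine):

```latex
% NAD's penumbral constraints on any permissible sharpening \sigma:
\sigma \models \mathrm{Coin}(d) \rightarrow \bigl(\mathrm{Coin}(e) \lor \mathrm{PD}(d,e)\bigr)\\
\sigma \models \mathrm{Coin}(e) \rightarrow \bigl(\mathrm{Coin}(d) \lor \mathrm{PD}(e,d)\bigr)
```

A partially arbitrary sharpening makes $\mathrm{Coin}(d)$ true and $\mathrm{Coin}(e)$ false for some pair of candidates between which there is no principled difference, and so violates the first constraint.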

Before concluding this section, I would like to address a response to my use
of *NAD* which may occur to defenders of the supervaluational solution.
There is something interesting that holds for every arbitrary sharpening of
coinhood. For each arbitrary sharpening, every non-coin almost entirely
overlaps something else that is a coin. However, this feature is not shared by
the coins. There is no arbitrary sharpening according to which there is a coin
that almost entirely overlaps something else that is also a coin. Indeed, this difference points to
a general principle which many philosophers would undoubtedly wish to preserve,
namely, that if something of kind *x* mostly overlaps something else, then
that something else is not also an *x*.[12]

However, while the difference between coins and non-coins that I have just
outlined is indeed principled, this difference by itself gives us no
information about which things are coins and which are not; merely noting that
coins and non-coins differ in this way does not partition the world into coins
and non-coins.[13] And this
means that pointing to this difference does not allow the advocate of
arbitrary sharpenings to satisfy *NAD*. The requirement that this
difference be respected operates *at best* as a constraint upon the
admissibility of any principle or principles that putatively separate coins from non-coins.

Even if my objection of the last paragraph is waived, there is another
principle that is inextricably linked to *NAD*. Just as there should be
certain principled differences between coins and non-coins, there should also be
certain principled ‘coin-making’ similarities between coins. Hence, we have *NAS*:

*Non-Arbitrary Similarities* (NAS):

For any pair of coins, there is a principled similarity between them which forms the basis for their both being coins.

This principle focuses only on coins, which means that the difference
between coins and non-coins mentioned in connection with *NAD* is not
relevant. *NAS* imposes its own penumbral connections. Consider again the
borderline coins *d* and *e*. If *d* and *e* are both coins
then there is a principled similarity between them which makes it the case that
they are both coins. Arbitrary sharpenings are bound to violate such
connections.
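*NAS*’s penumbral connection takes the same schematic form as those imposed by *NAD*, writing $\mathrm{PS}(d,e)$ for ‘there is a principled, coin-making similarity between *d* and *e*’ (again, the notation is mine):

```latex
% NAS's penumbral constraint on any permissible sharpening \sigma:
\sigma \models \bigl(\mathrm{Coin}(d) \land \mathrm{Coin}(e)\bigr) \rightarrow \mathrm{PS}(d,e)
```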

With respect to coins, I find it difficult to see how either *NAD*
or *NAS* could be denied. *Perhaps* if ‘coin’ were a
family-resemblance term then *NAD* and *NAS* might come into
question. I doubt that ‘coin’ is in fact a family-resemblance term, but the
arguments I have given apply to terms other than ‘coin’. So if, for example,
it turns out that there are reasons for denying *NAD* and *NAS* that
centre around ‘coin’ being an artefact term, the arguments could be restated
using natural kind terms. For example, instead of concentrating on the problem
of the many coins in your pocket we might consider the Problem of the Many as
it applies to lumps of lead, modifying *NAD* and *NAS* accordingly.

I conclude that the supervaluational solution to the Problem of the Many is in trouble.

Now is a good time to point out that very few people have supported in print exactly the straight supervaluational solution that has so far been my focus in this paper. The only endorsement of the straight solution I can find in print is due to Mark Heller.[14] And even here, Heller does not regard the supervaluational solution as integral to his project (p. 111).

Despite the paucity of straight supervaluational solutions in the
literature, the view that I have been attacking here is no straw man. For one
thing, it is plausible to think that the straight supervaluational solution may
enjoy some support. And more importantly, the straight solution is an integral
component of solutions to the Problem of the Many that *have* been
proposed in print. Here, I am thinking in particular of solutions presented by
David Lewis and E.J. Lowe.

Lewis offers a contextual solution to the Problem of the Many.[15]
In most contexts, notably our less philosophical everyday thoughts and
communications when we are ignoring the many candidates to be the shilling that
has caught your attention at the exhibition, we favour an interpretation of
coinhood according to which there is *strictly* one such shilling.
(Lewis uses the example of cathood rather than coinhood.) In these contexts, he
thinks, the supervaluational procedure gives us the answer that we want, since
it is true on all permissible sharpenings of ‘is a coin’ that there is strictly
one shilling. However, during our more philosophical moments, when we
explicitly note that there are many equally deserving candidates to be the shilling,
context favours an interpretation of coinhood according to which there
genuinely are many shillings. Lewis hastens to add that in these contexts the
sense in which there is more than one shilling is benign. The various shillings
have almost all of their parts in common, so they are *almost*
identical.[16] There are many shillings, but they are
almost one. So it is harmless enough to approximate and say that there is one
shilling.

At this point you might want to ask whether we could put supervaluations aside and opt for a simpler non-contextual solution, allowing almost-identity to do all of the work. Lewis thinks we can’t do this because there are cases of the Problem of the Many that do not involve almost-identity:

Fred’s house taken as including the garage, and taken as not including the garage, have equal claim to be his house. The claim had better be good enough, else he has no house. So Fred has two houses. No! We’ve already seen how to solve this problem by the method of supervaluations ... [A]lthough the two house-candidates overlap very substantially, having all but the garage in common ... we cannot really say they’re almost identical. So likewise, we cannot say that the two houses are almost one.[17]

Lewis’ view is that in cases like Fred’s house, we can leave all of the work to supervaluations. Regardless of whether ‘is a house’ is sharpened so as to include or exclude Fred’s garage, it remains true that Fred has only one house.

Once we realise that supervaluations cannot contribute to a treatment of the Problem of the Many, Lewis is left without the resources to support a contextual solution. Moreover, without recourse to supervaluations, Lewis has nothing to offer us when it comes to dealing with cases like Fred’s house, where overlap between the house-candidates is extensive but not extensive enough for the ‘many but almost one’ solution to come into play. It is also worth noting that Lewis’ special dependence on supervaluations in certain cases of substantial, but not almost complete coincidence, is damaging to his contextual account even if any general objections to the use of supervaluations are waived. Given the terms in which he has set up his contextual account, it appears that he should admit to contexts in which Fred has two houses that are not almost one, namely, those contexts in which we are explicitly attending to the candidates to be Fred’s house.

E.J. Lowe’s solution, on the other hand, seems to give us the resources to
say that there are no conditions under which Fred has more than one house. Lowe
suggests that we solve the Problem of the Many by invoking the view that
identity is distinct from constitution. Concerning Fred’s house, Lowe would say
that Fred has one house and there are two structures, neither of which is *identical*
with the house, that have equal claim to *constitute* the house. ‘Fred’s
house’ is a precise expression, whereas ‘the constitutor of Fred’s house’ is a
vague designator and supervaluations are to be used to secure the conclusion
that Fred’s house has only one constitutor.[18]

We might try and extend my previous argument against the supervaluational
solution by observing that the distinction between constitution and identity
does not prevent the violation of principles that are closely related to *NAD*
and *NAS*, such as (*NAD**):

(*NAD**): For any coin-constitutor and non-coin-constitutor, there is
a principled difference between them which forms the basis for one’s
constituting a coin and the other’s failing to constitute a coin.

However, such a straightforward extension of the argument is, at the very
least, questionable. We might observe that ‘coin-constitutor’, unlike ‘coin’,
is a theoretical term of art whose meaning is not grounded in common usage. So
it may well be that a supervaluationist of Lowe’s ilk need not endorse (*NAD**).[19]

There are, however, other reasons to be concerned about Lowe’s solution. The view that there is a distinction between constitution and identity is supposed to be a piece of serious metaphysics. On that view, our ontology includes, in a serious sense, both coin-constitutors and coins. If ‘coin-constitutor’ is a vague designator then that is because the constitution relation is vague. And that amounts to ontological vagueness. Even if this objection is waived, it can be shown that the thesis that there is a problem of the many coin-constitutors, but no problem of the many coins, leads to serious difficulties.

If, as Lowe’s solution suggests, there is no problem of the many coins but only a problem of the many coin-constitutors, then there are things such that they are coins. Consider such a coin, and consider the following proposition:

*(L)* There is a largest exact region of space such that the coin fills
*that* region of space.[20]

Given that we are confronted here with a thoroughly semantic treatment of
the Problem of the Many, set against the backdrop of a semantic account of
vagueness in general (though on this second point observe the remarks at the
end of this section), we can stipulate that no objects have fuzzy boundaries,
and that, therefore, *(L)* is true. *(L)* is also super-true. Since
‘is a coin’ is precise, there is only one way of making ‘is a coin’ precise.
And since *(L)* is true according to the one way of making that predicate
precise, *(L)* is super-true. Now, consider the following propositions:

*(LC)* There is a largest exact region of space filled by the
coin-constitutor.

*(LC*)* There is a largest exact region of space such that the
coin-constitutor fills *that* region of space.

*(LC)* is super-true, since for every candidate to constitute the coin
there is a largest region of space that the candidate fills. However, *(LC*)*
is super-indeterminate, since *just which* region counts as the largest
filled region varies from candidate to candidate; no particular region of space
is such that it satisfies *(LC*)* on more than one sharpening of
‘coin-constitutor’. Now, constitution theory says that a coin and its
constitutor fill the same regions of space. Thus, when set against the
background of constitution theory, *(L)* and *(LC*)* are true on exactly the
same sharpenings. This means that *(L)* inherits the status of *(LC*)* and
counts as super-indeterminate. But we have already seen that *(L)*
is super-true. This looks like a severe difficulty.
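The structure of the difficulty can be set out as a short derivation. This is a sketch on my part; ‘CT’ abbreviates the constitution theorist’s claim that a coin and its constitutor fill the same regions of space:

```latex
\begin{align*}
&1.\ \text{(L) is super-true}            &&\text{[`is a coin' is precise]}\\
&2.\ \text{(LC*) is super-indeterminate} &&\text{[the largest filled region varies across sharpenings]}\\
&3.\ \text{on each sharpening, (L) and (LC*) agree in truth value} &&\text{[CT]}\\
&4.\ \text{(LC*) is super-true}          &&\text{[from 1 and 3]}\\
&\ \text{Lines 2 and 4 are inconsistent.}
\end{align*}
```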

Where could Lowe go from here? He might conclude that there is, after all,
in addition to the problem of the many coin-constitutors, a problem of the many
coins. But then, if he decides that he still wants to endorse a
supervaluational solution he will need to sharpen ‘is a coin’. And then he will
fall foul of *NAD*.

It is important to make one further comment about my argument against Lowe.
In ‘The Problem of the Many and the Vagueness of Constitution’, Lowe makes it
clear that he is endorsing a semantic solution to the Problem of the Many.[21]
Elsewhere, however, he has argued for the coherence of metaphysical vagueness
on the basis of a possible interpretation of quantum mechanics.[22] Lowe argues that there may well be vagueness
involving metaphysically indeterminate identities at the quantum level
resulting from the entanglement of quantum particles. Perhaps Lowe would like
to maintain that there is (or at least, might well be) metaphysical vagueness
at the quantum level, while still regarding the Problem of the Many as being
amenable to a semantic solution. Even so, the putative indeterminate identities
Lowe discusses involve only matters of identity over time (diachronic identity)
rather than identity at a time (synchronic identity).[23]
And insofar as metaphysical vagueness is restricted to diachronic identity,
nothing in the cases Lowe presents casts doubt on any of the principles, such
as *(L)*, which I have used in arguing against Lowe’s semantic solution,
since each of these pertains exclusively to synchronic matters.

I believe that my argument against the supervaluational solution to the Problem of the Many has negative implications not only for the supervaluational treatment, but also for the solution to the problem which falls out of the view that vagueness is a purely epistemic phenomenon.

According to those who support the epistemic theory of vagueness, such as
Timothy Williamson and Roy Sorensen, vagueness is to be located neither in the
world, nor in our concepts.[24]
Instead, it is to be located in our ignorance. On this view, there is always a
fact of the matter as to whether a given concept applies to a given object.
So-called borderline cases of application are cases where our knowledge of
which concepts apply to which things fails us. Thus, the epistemic theorist’s
solution to the Problem of the Many is to say that we do always pick out one
aggregate of particles when we say things like, ‘*The* coin is
such-and-such’.[25] Our
concepts are perfectly sharp but our epistemic limitations keep us from
recognising this truth.

Now, we have already seen what it takes for an ordinary physical kind notion
to be sharp. It takes a measure of arbitrariness with respect to the matter of
which things are instances of that notion and which things are not. So if our
concepts are sharp, as the epistemic theorist has it, then it is arbitrary that
some aggregates of particles are coins while others, which differ only very
minutely, are not. Thus, it turns out that if the epistemic theorist is
followed, *NAD* and *NAS* are violated. In addition, the epistemic
theorist needs to accept some sort of magical theory of reference. A final
scene from the exhibition will illustrate this.

By the end of the day you are so enamoured of the shilling in the exhibition that it falls into your pocket and acquires the name ‘Bessie’. As you croon softly to her, your voice reaches out across the short distance between you and, in a way that is quite inexplicable, bestows your affection on just one aggregate of particles.[26]

*Monash University*

[1] Unger,
Peter, ‘The Problem of the Many’, *Midwest Studies in Philosophy*, 5,
1980, pp. 411–67.

[2] Unger, ‘The Problem of the Many’, p. 412.

[3] Lewis, David, ‘Many but Almost One’, in John Bacon *et al.* (eds.), *Ontology, Causality and Mind*, New York: Cambridge, 1993, pp. 23–42, at p. 29.

[4] This is a simplification. There are, of course, other factors involved, such as matters relating to causal history. Not every object with the requisite physical properties for coinhood must count as a coin. For our purposes, though, this complication can be ignored.

[5] How much overlap does this principle permit in the case of coins? Whatever the answer is here, the degree of overlap that comes into play when considering the problem of the many coins in your pocket is surely great enough to count as significant.

[6] In fact, on Russell’s account of descriptions, the second sentence is the analysis of the first.

[7] Unger, ‘The Problem of the Many’, p. 415.

[8] Tye,
Michael, ‘Fuzzy Realism and the Problem of the Many’, *Philosophical Studies*,
81, 1996, pp. 215–225, at p. 221.

[9] Lewis, ‘Many but Almost One’, p. 23.

[10] Fine, Kit, ‘Vagueness, Truth and Logic’, *Synthese*, 30, 1975, pp. 265–300, at p. 270.

[11] Fine, Kit, ‘Vagueness, Truth and Logic’, pp. 269–71.

[12] To be more accurate, many philosophers would like to preserve this principle for ordinary physical object kinds. There may be things for which this principle fails, for instance, mereological sums (if we suppose that composition is unrestricted).

[13] *Cf.* Unger, ‘The Problem of the Many’, Section 10.

[14] Heller,
Mark, *The Ontology of Physical Objects: Four Dimensional Hunks of Matter*,
Cambridge, 1990, pp. 151–4.

[15] Lewis, ‘Many but Almost One’, pp. 34–5.

[16] Lewis borrows the notion of almost-identity from David Armstrong. See Armstrong, ‘Reply to Lewis’, in Bacon *et al.*, *op. cit.*, pp. 38–42.

[17] Lewis, ‘Many but Almost One’, pp. 35–6. See also Johnston, Mark, ‘Constitution is not Identity’, *Mind*, 101, 1992, pp. 89–105, at p. 101*f*.

[18] Lowe, E.J., ‘The Problem of the Many and the Vagueness of Constitution’, *Analysis*, 55, 1995, pp. 179–82, at p. 180.

[19] Thanks here to an anonymous referee.

[20] There are many regions of space filled by the coin. The largest region filled by the coin is the region that, intuitively speaking, marks the coin’s spatial extension.

[21] Lowe, ‘The Problem of the Many and the Vagueness of Constitution’, p. 180.

[22] See, for instance, his ‘Vague Identity and Quantum Indeterminacy’, *Analysis*, 54, 1994, pp. 110–14, and also his ‘Ontic Indeterminacy of Identity Unscathed’, *Analysis*, 61, 2001, pp. 241–5.

[23] Lowe, ‘Ontic Indeterminacy of Identity Unscathed’, p. 243.

[24] Sorensen, Roy, ‘Sharp Boundaries for Blobs’, *Philosophical Studies*, 91, 1998, pp. 275–95, and Williamson, Timothy, *Vagueness*, London: Routledge, 1994.

[25] Sorensen, ‘Sharp Boundaries for Blobs’, pp. 292–3.

[26] Thanks to Sam Butchart, Lloyd Humberstone, Brian Weatherson and two anonymous referees for helpful comments. Special thanks to John Bigelow for very useful and encouraging discussions.