Use this forum to discuss the philosophy of science. Philosophy of science deals with the assumptions, foundations, and implications of science.
#65085
I've used complex numbers a bit (in calculus mostly) and also delve into number theory from time to time. Currently, while I think I understand math at a decent level, I'm having problems with accepting complex numbers. I know we probably won't ever get rid of them and I should just take our understanding of math as is, but thinking about things in relative terms made me question it.

The basic problem is with roots: all even roots can be broken down to square roots, and a square root of -1 can be pulled out to leave us with the imaginary unit, which leads to complex numbers. While I have no problem with this in the way we view math currently, I'm wondering if we are viewing math, and this fundamental thing in particular, correctly.

Starting with powers, if you plot the integer powers one by one, you will see that even powers look vastly different from odd ones. That's been a problem in my mind, but I'm having trouble with how to voice it. I'll try the best I can with this.

When adding and subtracting, we do everything from an origin, and we label that origin "0". With relativity, where that zero is can be wherever we want to place it. The term "3" is really 3 units in a certain direction (say up), and "-3" is 3 units in the opposite direction (say down). Granted, I understand it's about more than direction, but put that aside for a little. When we add "3+3", we are going 3 units up from the 3 on the line (3 away from the origin), when we subtract "3-3" we are going 3 units down from that 3. Same with "-3+3". Granted, I understand that it needs to be interchangeable and they are both amounts that can be interchanged, but things get more complicated.

When we are multiplying, we are moving a certain number a certain number of times from a number. "3*3" can be seen (can it?) as moving 3 units 3 times from the origin going up: "3+3+3". "(-3)*3" can be seen as moving 3 units 3 times from the origin going down. So what about "(-3)*(-3)"? This is where it starts getting hairy for me. We are moving 3 units 3 times from the origin going down and flipping direction: "-(3*3) then -(-(3*3))". So "3*(-3)" would be moving 3 units 3 times going up and flipping direction: "+(3*3) then -(+(3*3))". Both of the second negatives could be seen as outside operators rather than as being on the numbers themselves (more like "-((-3)*3)"). Yes, I know it sounds weird, but there is a reason that I think the second negative should be treated differently. Let me try to explain this first before going into powers.

When I multiply something real by something, I usually state it like "3 dogs times 2 equals 2 dogs times 3". The second numbers are not dogs, but plain amounts, even though the first numbers are amounts of the something. So "3 dogs times 2 dogs" seems kind of odd to me. Granted, I guess "(3 times 2) dogs" might sound OK, but I would usually associate one of the numbers with the real thing and the other as the multiplier. The other thing I wanted to bring up was exponentiation. I'm not going into it just yet, but I wanted to show a little something about it. "3^(-2)" does not change the sign of the base, but "3*(-2)" does. I'm not saying it shouldn't, but the negative on the power means something different than the negative on the base. Why is that so for powers, but not for multiplication?

Ok, so going more into exponentiation, can "3^3" be seen as moving 3 units, 3 times, 3 times over, from the origin going up? "+(3*3*3)" or "+(3+3+3 + 3+3+3 + 3+3+3)"... Can "-3^3" be seen as moving 3 units, 3 times, 3 times over, from the origin going down? "-(3*3*3)" or "-(3+3+3 + 3+3+3 + 3+3+3)"? Now you can probably see a little more clearly what I'm proposing. This would mean that "(-2)^2" would equal "-(2^2)", which could have serious implications that I'm not quite seeing yet. I'm hoping somebody will point these out.
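For what it's worth, here is a minimal check (using Python purely as a calculator, nothing deeper) of how the current conventions handle the expressions above; notably, the bare expression "-2**2" is already parsed as "-(2^2)":

# A quick check of how the standard conventions evaluate the expressions above
# (Python follows the usual rules: ** binds tighter than unary minus).

print(3 ** -2)     # 0.1111... : a negative exponent gives a reciprocal, not a sign flip
print(3 * -2)      # -6        : a negative factor does flip the sign
print((-2) ** 2)   # 4         : squaring a negative gives a positive under current rules
print(-2 ** 2)     # -4        : parsed as -(2**2), which is the reading proposed above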

Does this make any sense to anyone else? Are my troubles misplaced and I should just keep dealing with imaginary numbers (because they probably aren't going away either way)? Can someone please either help me to clarify this or put me back in line, because this is bothering me currently (well, has been for a while)?
#65111
I had the same sort of problem when I took abstract algebra. This isn't an answer you want to hear, but the bridge to higher mathematics is to put aside all images of concrete thinking.

In abstract algebra I thought I failed the first test (strangely I actually got an A). Since I didn't understand anything, I went to the prof. to tell him I was dropping the course. He tried to get me to stay by giving me the advice not to try to get to the bottom of anything -- just simply manipulate the symbols by following the rules.

And if I could do that, he said, I would do well, and then after that, my ability to do abstract thinking without visualization would come naturally. And this, he said, is how higher math is done. (For a short summary of this kind of advice, here's a link to Abstraction in Mathematics.)

Well, I tried his advice for another week or so and changed my major to chemistry.
#65130
dowhat1can wrote:I had the same sort of problem when I took abstract algebra. This isn't an answer you want to hear, but the bridge to higher mathematics is to put aside all images of concrete thinking.

In abstract algebra I thought I failed the first test (strangely I actually got an A). Since I didn't understand anything, I went to the prof. to tell him I was dropping the course. He tried to get me to stay by giving me the advice not to try to get to the bottom of anything -- just simply manipulate the symbols by following the rules.

And if I could do that, he said, I would do well, and then after that, my ability to do abstract thinking without visualization would come naturally. And this, he said, is how higher math is done. (For a short summary of this kind of advice, here's a link to Abstraction in Mathematics.)

Well, I tried his advice for another week or so and changed my major to chemistry.
I'm really good at following rules, and tend to do very well when I follow them. I disagree with the logic at that point, though.

I guess the biggest point of contention is with the exponent. We treat multiplication much the same as addition and give it the same properties (like the commutative property x*y=y*x), but while I can see why it's done this way, I disagree. The exponent function is more like the next step up from multiplication, and we don't treat it like multiplication or addition: "x^y =/= y^x". What I'm trying to point out, in trying to settle this again in my head, is that much like "x^y", where the "y" is not treated the same as a number directly operating on "x" but rather tells us what is to be done with "x", "x*y" seems like it should be kind of the same, or working toward it. When there is a negative on the exponent, it's treated as an indication of operation, not as a negation. Basically, "x*y =/= y*x".

Even in addition the negations are already different. "x-y" is x moved "y" units in the direction opposite to whatever up is. We can commute the negative to put "y" on the other side of the number line and it comes to the same number: "x+(-y)". However, when we take a step toward another operator that increases the amount added, for simplicity, the negation of the second number starts becoming a problem. I guess the implications are the difficulty involved with it, but surely this was brought up before. Granted, it was probably brought up and discarded, but it makes sense logically (well, in my head). I'd like to see the notes if it was brought up, to see the basic arguments, but I can't really find if/when it was or who might have brought it up, because they may have just been laughed at. I'm fine with being laughed at, and want to bring it up again. I'll keep looking to see if I find someone that may have brought it up.
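A quick numerical check of that contrast (standard rules, Python just as a calculator):

x, y = 2, 3
print(x + y, y + x)    # 5 5 : addition commutes
print(x * y, y * x)    # 6 6 : multiplication commutes (under the standard rules)
print(x ** y, y ** x)  # 8 9 : exponentiation does not -- x^y =/= y^x in general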
#65145
Craniumonempty, I think it's really cool that you are asking questions about math in this way. I admit that my first reaction was, "Well, you can't think about exponential notation in that way; the 'multiplication' in powers is not commutative in terms of the meaning of the expression." But if a prof. gave me an answer like that, I'd be ticked off.

The Greeks of course thought in terms of geometry so a was thought of in terms of a line, a^2 in terms of a surface, and a^3 in terms of a solid. How they thought of a^4 is beyond me.

I remember Descartes was the first to think of powers in terms of the proportions of lines, so 1 : a :: a : a^2, and so on, where 1 was any arbitrary length, but I can't remember how he graphed this expression.

If I can get some time off this afternoon I'll see if I can't find someone in math history who tried to reason through powers in the ways you suggest. Seems like there might have been--but it's been a few years since I've read anything in math history.
By Craniumonempty
#65159
dowhat1can wrote:Craniumonempty, I think it's really cool that you are asking questions about math in this way. I admit that my first reaction was, "Well, you can't think about exponential notation in that way; the 'multiplication' in powers is not commutative in terms of the meaning of the expression." But if a prof. gave me an answer like that, I'd be ticked off.
Well, if this ever gets any real foothold, I'll probably be swinging from a tree. Figuratively, of course. I don't think many will like this idea, but I don't think math is supposed to be about liking it, rather about... well, math. However, my mind seems to think there is something more consistent with this, even though I have a hard time expressing it.

I've worked with these expressions many times in programming and have had to rip them apart and put them back together to make things work. With addition/subtraction, they seem to indicate direction of the numbers. Kind of like the negative themselves. It's more like a direction rather than something less than nothing. Like if I owe you money that's a negative for me, but it's a positive for you. That's why I brought up relativity, even though it was more an attempt to show the thinking behind it.

So, all numbers are basically uh, positive numbers, and the negative is an indicator of direction from a selected origin. In a way, +/- infinity is really just one big infinity with no origin, and we just put one there for ease. I guess in a way, this could work for addition as well.

Ok, this is me trying to solidify the idea a little more.

"1+2" is really a point at 1 from the origin moving (I'll use "up" again to indicate a direction) up 2 units to arrive at a position 3 from the origin. "1-2" is a point at 1 from the origin moving down 2 units to arrive on the other side of the origin, so we indicate it being on the other side of the origin by saying "-1". The negative is just telling us where it's located from the origin that we picked out. "1+(-2)" and other properties still work, but negative on the second number we are putting to give us ease of usage for the equation and it is still indicating direction of movement (until we switch it with the 1). The "1" in all of the equations are the initial point, and the other side of the "+" or "-" is the movement and which direction. All other properties that we have derived from this I would still say are correct.

The same goes for multiplication, but it gets more complicated, because now we aren't really indicating direction of movement with multiplication itself (but we can and do). "1*2" is more like a unit of 1 now rather than a point (this thought of treating it like a unit instead of a point just popped into my head, so I haven't really worked it all the way out yet) that is stacked twice. One unit is one point out, and moving in another unit doubles it: "1+1". Yeah, maybe the unit idea is a little much. I'll work on that later. Either way, for "1*3" we are moving out the same unit 3 times: "1+1+1". "1*-3" now seems like two different operations. We are starting with one unit, we are moving the unit 3 times, and then we are changing direction. Basically it's an operation more like "-(1*3)". The reason "1*-3" doesn't work in my brain (well, it does, but only if I just do the math without thinking of the conflicts that I currently see) is because "-3" is still three units, and direction doesn't really matter. So "1*-3" is really how we would currently write "1*|-3|", since 3 is still three units, with an extra operation changing the direction of the original unit: "-(1*|-3|)". But this is still fine, because all of the other properties can still be derived from this (I think).

The problem is that then it gets more complicated. Basically, with this thinking, "-3*-3", where the second negative is not a separate operation, is really how we would now say "-3*|-3|", or "-9", because multiplication doesn't itself indicate direction.
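If it helps, here is a rough Python sketch of that reading (it models the proposal, not standard arithmetic; the function and parameter names are just made up for illustration): the second factor is taken as a pure count of units, and any negative is handled as a separate direction flip.

# A rough sketch (not standard arithmetic) of the idea above: treat the second
# factor as a pure count of units, and treat any negative sign as a separate
# "flip direction" operation applied afterwards.

def multiply_as_units(start_sign, unit, count, flip=False):
    """Stack a positive `unit` `count` times, point it along `start_sign`, then optionally flip."""
    total = 0
    for _ in range(count):          # repeated addition: unit + unit + ...
        total += unit
    total *= start_sign             # direction the original unit points in
    return -total if flip else total

print(multiply_as_units(+1, 1, 3))              # "1*3"  -> 3
print(multiply_as_units(+1, 1, 3, flip=True))   # "1*-3" read as -(1*|-3|) -> -3
print(multiply_as_units(-1, 3, 3))              # "-3*-3" read as "-3*|-3|" -> -9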

This brings us to the powers. Using this same thinking, it's probably easier to see why I would say "-2^2" is equivalent to "-(2^2)" because multiplication doesn't indicate direction and so powers wouldn't either as an extension of it. So even if I were to say "-2^2 = -2*-2", the second "-" would not be an operator and could be dismissed so the formula would be shown in current understanding as "-2*|-2|" or really "2*2" in the same direction as the original number indicated by "-".

I know that's not a proof (or even close to one), but hopefully it clears it up a little bit more.
dowhat1can wrote:The Greeks of course thought in terms of geometry so a was thought of in terms of a line, a^2 in terms of a surface, and a^3 in terms of a solid. How they thought of a^4 is beyond me.

I remember Descartes was the first to think of powers in terms of the proportions of lines, so 1 : a :: a : a^2, and so on, where 1 was any arbitrary length, but I can't remember how he graphed this expression.

If I can get some time off this afternoon I'll see if I can't find someone in math history who tried to reason through powers in the ways you suggest. Seems like there might have been--but it's been a few years since I've read anything in math history.
Cool, I'm still looking to see if there was anything recorded. I've started looking into some current math, because I think there are some things that use non-commutative multiplication. What's funny to me is that I think I found them while reading up on abstract algebra.
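For what it's worth, matrix multiplication is probably the most familiar example of non-commutative multiplication; here's a quick check with numpy (matrices chosen purely for illustration):

# Swapping the order of matrix multiplication generally changes the result.
import numpy as np

A = np.array([[1, 2],
              [0, 1]])
B = np.array([[1, 0],
              [3, 1]])

print(A @ B)   # [[7 2]
               #  [3 1]]
print(B @ A)   # [[1 2]
               #  [3 7]]  -- a different matrix, so here A*B =/= B*A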

EDIT: Say you owe me $40,000, which is negative on my books and positive on your books. If I tell you to give me the square root of what you owe me and we'll call it even, you can do the math on your end and get $200. However, if we are working out the problem as it is now, it's "($40,000)^(1/2)=$200" you can give me according to your books, but on my books it's "(-$40,000)^(1/2)=$200*i". So, what do you owe me? Is "$200*i" really "-$200"?
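To pin down the arithmetic in that EDIT (a quick check using standard complex arithmetic, nothing more): the square root of the negative balance comes out as 200i, and 200i is genuinely not -200, since squaring -200 gives +40,000 rather than -40,000.

# Checking the bookkeeping example with standard complex arithmetic.
import cmath

print(cmath.sqrt(40000))    # (200+0j)    : your side of the books
print(cmath.sqrt(-40000))   # 200j        : my side of the books -- 200i, not -200
print((200j) ** 2)          # (-40000+0j) : squaring 200i recovers -40000
print((-200) ** 2)          # 40000       : squaring -200 does not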
#65172
I'm trying, as the English say, to get my head around the problem to see exactly what's at issue.

Putting aside imaginary numbers for now, are you asking specifically why the following inequality is true per se?

a^b^c = a^(b^c); these two are not equal to these two: (a^b)^c = a^(b*c)

(BBCode doesn't do inequality, so far as I can tell)
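(If it helps to see the inequality with actual numbers, here is a quick check in Python; its ** operator happens to be right-associative, matching the a^(b^c) reading.)

# a^b^c groups as a^(b^c), which is generally not the same as (a^b)^c = a^(b*c).
a, b, c = 2, 3, 2

print(a ** (b ** c))   # 512 : a^(b^c)
print(a ** b ** c)     # 512 : ** is right-associative, so this is the same thing
print((a ** b) ** c)   # 64  : (a^b)^c
print(a ** (b * c))    # 64  : a^(b*c), equal to (a^b)^c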

Or is this question not at issue?

Thanks.
#65177
dowhat1can wrote:I'm trying, as the English say, to get my head around the problem to see exactly what's at issue.

Putting aside imaginary numbers for now, are you asking specifically why the following inequality is true per se?

a^b^c = a^(b^c); these two are not equal to these two: (a^b)^c = a^(b*c)

(BBCode doesn't do inequality, so far as I can tell)

Or is this question not at issue?

Thanks.
For inequality, I've been using =/= , but <> works also. Either way, what you stated is not the issue. The issue really does have to do with imaginary numbers, and with the root of the problem that leads up to them. Basically, the issue is that this is not currently true: "(-2)^2 = -(2^2) = -4". Currently we have "(-2)^2 = 4 =/= -(2^2) = -4". That's the basics of the entire problem without going into detail. That is where imaginary numbers come from. I'm also trying to show (probably very badly) that current multiplication is not 'wrong', but that how it's thought of leading up to exponential functions is what causes the problem. The negative in the second number is another operation. That's pretty much the problem in a nutshell.
#65178
I suppose that your problem stems from the perceived need to find a (Euclidean) geometric rationale for the rules of arithmetic. There are quite a number of interesting books on the subject of how humanity has progressed in the field of mathematics - specifically when trying to go beyond the natural numbers N=(1,2,3...). The leap to negative integers, zero, the rationals, the reals and the complex numbers is definitely interesting reading. In examining your previous posts it seems likely that you have arrived at some of the stumbling blocks that the Sumerian, Greek, Indian and Arabic (as well as more recent) mathematicians had to deal with in times past.

Unfortunately our inborn affinity with 3D Euclidean space does not prepare us for even a small fraction of the possible coherent systems of transformations that mathematics presents us with. The field is necessarily abstract in nature.

Luckily there exists a remarkable body of mathematical texts which can help us to move forward in understanding. I would like to recommend "The Story of Mathematics" by Richard Mankiewicz, in which I think you will find some affinity with the problems that mathematicians in ages gone by have had to solve.

It is always interesting to read in-depth and serious attempts to get to the bottom of mathematical conundrums such as you have presented. I spent some 5 years between 1995 and 2000 teaching mathematics to college-level students (yrs 10-12) and I have had many profound moments where I have had to really think about the subject: like, why is the sum of angles in a Euclidean triangle equal to pi, or why is the Pythagorean relation for right triangles (a^2 + b^2 = c^2) true?

At all times it is true that serious thought about the foundations of what we think is the basis of our reality is something to be cherished.

Regards
#65193
T0rp wrote:I suppose that your problem stems from the perceived need to find a (Euclidean) geometric rationale for the rules of arithmetic. There are quite a number of interesting books on the subject of how humanity has progressed in the field of mathematics - specifically when trying to go beyond the natural numbers N=(1,2,3...). The leap to negative integers, zero, the rationals, the reals and the complex numbers is definitely interesting reading. In examining your previous posts it seems likely that you have arrived at some of the stumbling blocks that the Sumerian, Greek, Indian and Arabic (as well as more recent) mathematicians had to deal with in times past.

Unfortunately our inborn affinity with 3D Euclidean space does not prepare us for even a small fraction of the possible coherent systems of transformations that mathematics presents us with. The field is necessarily abstract in nature.
I have no problem with imaginary numbers in usage and understanding, but they still don't sit well in my head. I'm constantly delving into number theory as a hobby, and tend to also have to deal with basics all the time with breaking apart formulas in programming. Even then, I don't really have to use complex numbers, but I do get into looking at basics from every angle I can to make it do the same thing in a different way. Breaking apart multiplication was more of what brought me to this thinking. Well, it was probably more division, since I've spent way more time on that. Most of what I do with that is pure variables, so not really directly with the numbers until the program is running. Granted it's very definite, and I don't have to deal with anything too complicated directly with math though it plays a vital role.

I'm not saying that I'm correct either, and I know this (even if it doesn't sound like it). However, I would like to pursue the idea more, because it seems to fit.
T0rp wrote:Luckily there exists a remarkable body of mathematical texts which can help us to move forward in understanding. I would like to recommend "The Story of Mathematics" by Richard Mankiewicz, in which I think you will find some affinity with the problems that mathematicians in ages gone by have had to solve.

It is always interesting to read in-depth and serious attempts to get to the bottom of mathematical conundrums such as you have presented. I spent some 5 years between 1995 and 2000 teaching mathematics to college-level students (yrs 10-12) and I have had many profound moments where I have had to really think about the subject: like, why is the sum of angles in a Euclidean triangle equal to pi, or why is the Pythagorean relation for right triangles (a^2 + b^2 = c^2) true?

At all times it is true that serious thought about the foundations of what we think is the basis of our reality is something to be cherished.

Regards
I may have to wait on that as I'm not in the states or near any libraries that carry it... Well, in English. I'll try to get my hands on it to read it though. I've probably heard a lot of it, but forget.

Either way, which axioms are we using currently that deal directly with addition, multiplication, and powers? Should I study abelian groups or is there something more basic that I can work through? There are so many types of math that I get lost at times and I'm sure I could probably even find something to fit this idea if I wanted. I guess I just want what are the axioms for the very basics that we use in daily life using real numbers.

I've been under the impression that multiplication is a grouping of addition:

a*n = a+a+...+a } n times

and power was grouping of multiplication:

a^n = a*a*...*a } n times

I'm guessing it's much the same, but there's probably a lot I'm missing (I only saw the power written as such when I was browsing around).
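As a sanity check on those two definitions, here is a minimal sketch of them in Python for non-negative whole-number n (it deliberately says nothing about negative or fractional n, which is exactly where my question starts):

def times(a, n):
    """a*n as repeated addition: a + a + ... + a, n times (n a non-negative integer)."""
    return 0 if n == 0 else a + times(a, n - 1)

def power(a, n):
    """a^n as repeated multiplication: a * a * ... * a, n times (n a non-negative integer)."""
    return 1 if n == 0 else a * power(a, n - 1)

print(times(3, 4))   # 12
print(power(3, 4))   # 81
print(power(-2, 2))  # 4 under the standard reading; my proposal would want -(2^2) = -4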

I'm saying that there isn't much difference between the two, and we are making multiplication fit (the same properties as addition?) by adding another operation. Mostly because we have done it that way for thousands of years. We built powers on top of that. You are correct though, I'm probably thinking of a problem that has been thought of many times before. I'm not even saying the system needs to be thrown out. I don't think complex numbers are going anywhere. However, maybe a new system? The problem is that I don't have background (and memory) to back it up or build it. Most is going by what I see when I break apart the basics over and over again to rebuild them.

So basically, when I see "a*-n" I don't see adding "a" for "-n" times; I see adding "a" for "n" times with a second operation that changes the sign of the entire function: "-(a*n)" (well, you may have heard this quite a lot already if you read my other posts). And we carry that operation with "n" to make multiplication commutative, because we have done it for a long time, it works out to the same thing, and it is much easier to work with. I have no problem with that, except I disagree that what seems like an extra operation is carried over to powers.

I probably do need to study more into the axioms. I don't suppose you know of an online reference that goes into the thinking behind them until I can get my hands on the book, do you? I use the Wolfram site all the time (especially when I need to differentiate or get an integral or something), but I think they just list them. I want to know the thinking on why they are self-evident, because the problem I have is because of axioms (I think).

Oh, and thank you for the reply. Even if you don't have time to respond again, it will probably help.
#65255
If you don't mind a bit of amateurish speculation, I like to view one of the questions you raise as originating from the metaphysics of math. I hope you find the historical reasoning below worth wading through, because it reflects a perspective on the nature of the problem.

I conclude that the problem involving i (the bookkeeping problem) comes about, in philosophical terms, by viewing mathematics ontologically and referentially rather than formally.

When something is viewed ontologically, problems of existence can take the form of Heidegger's question, "What is the nothing?" which is a confused way of putting the question because, so to speak, "The nothing "isn't" anything."

So the problem of reference stems from the problem of existential import of how we refer to something that does not exist without somehow assuming that it does. (And from this, many other problems of self-reference can be derived, but that's a separate question).

I pulled out my old math history book (Burton, The History of Mathematics, 1998) which notes that the Babylonians knew and worked with quadratic equations of the form
  • x^2 + ax = b and x^2 = ax + b
but the negative square root was always ignored in the tablets because the solutions were only viewed geometrically.

Later, quadratic equations were discussed by al-Khowarizmi (820 CE) -- whose works were translated into Latin in the 12th century. He also only admitted positive coefficients and did not accept negative quantities. He solved the same equations as the Babylonians and also this form:
  • x^2 + b = ax
His advance was that he no longer limited himself to the geometric tricks of the Babylonians but instead calculated all of the standard types of quadratics in accordance with a set of rules, i.e., used an algorithmic approach (as your posts do).*

When his work entered Europe, through Spain, the negative solutions were not mentioned.

The work of other Arabs recognized both positive and negative solutions, but they were rejected because in Burton's words, "They did not perceive the reality of negative solutions ... The very idea of a negative root implies the acknowledgement of negative numbers as independent entities having the same mathematical status as positive ones. (emphasis mine)"(Burton, 185).**

Fibonacci (1200 CE) got the two different values from quadratics, but rejected the negative values as proper solutions. He got the bookkeeping right, however, when he interpreted a negative number in a financial problem to mean a loss instead of a gain (Burton, 274).

Finally, Cardan (1500 CE) made use of negative roots on occasion; he called them "fictitious;" he wrote, "So progresses arithmetic subtlety the end of which as is said is as refined as it is useless." (Burton, 310).

So from the point of view of the history of math, mathematicians were struggling with the question of distinguishing between the nonexistence of a number, the negation of a number, and what Napier termed "ghosts of real numbers".
Craniumonempty wrote:Say you owe me $40,000, which is negative on my books and positive on your books. If I tell you to give me the square root of what you owe me and we'll call it even, you can do the math on your end and get $200. However, if we are working out the problem as it is now, it's "($40,000)^(1/2)=$200" you can give me according to your books, but on my books it's "(-$40,000)^(1/2)=$200*i". So, what do you owe me? Is "$200*i" really "-$200"?
In the bookkeeping example, the problem of ontological reference would be explained by noting the referential difference of "existence to you" and "existence to me." The entry on your books ought not be the square root of the money lacking (i.e., the square root of what doesn't exist), but instead it is the square root of what does exist, namely the existence of "my money" which is owed to you. I don't owe you the "absence" of your money (an absence is not something that exists), I owe you the "presence" of my money (something that does).

In other words, it's not your loss that is owed to you--it's what I gained which is owed. The confusion comes about through an ontological shift of reference.

So from a philosophical point of view, I see this problem as part of a cluster of problems similar to the simpler referential equivocation (which in this case does not involve the reflection of self-reference):
  • [Joe] Do you want to come to my party?
    [Sam] If my wife does.
    [Joe] I'm asking what you want?
    [Sam] I want whatever she says.
---
*The Latin corruption of his name is algorism, which is rendered in English as algorithm.
**Burton says negative numbers were accepted in India by Aryabhata (500 CE) and by Bhaskara (1000 CE), but were first accepted in Europe only in the 16th century.
#65323
I think you may be correct in thinking I'm looking at it algorithmically.
dowhat1can wrote: I don't owe you the "absence" of your money (an absence is not something that exists), I owe you the "presence" of my money (something that does).
But the absence does exist on my end. I think they should be interchangeable...

Beyond that, something in your other post, and looking at group theory, or maybe it was set theory... or one of them, triggered something. I think I found a way to build a mathematical system that can actually sit underneath the current one without changing anything, just adding a few things (with the exception of a few operators, but I'm working on that too). I'm trying to figure out how to spell it out without entirely rebuilding the number system, but it's difficult to start from the middle. The problem is that I'm too lazy to start from the beginning, nor do I think I'll ever be able to finish it if I do. On top of that, it won't really be formal, because it would take me much longer to research how to do it formally than to just do it.

Either way, I'm still working on a condensed version so that I can hopefully paste here. I want to get at least to a point to show that everything can be built above it. I've even found a way to leave in complex numbers. So basically, I'm not simplifying anything, rather making it more complicated, but I sort of want to see how it turns out.

When I started with this, I was thinking that the extra operation was a mistake, but I'm hoping to show that it can be seen as a branch in operations. Maybe what I'm building will be seen as useless, but I think at least some of it is probably interesting, if someone didn't already think of it. I haven't really seen it, but math is a huge domain. I don't have my hopes up of coming up with something new or anything, but I am interested in spelling out not only what I was originally thinking, but also how I was mistaken in thinking that that original thought would replace complex numbers; rather, they exist as a branch in that operation... Well, now I'm repeating myself. I'm going to get some sleep. I've been trying to figure out how to put things together coherently (like I ever do anything coherently that's not written in code) and condensed.

Whether or not I give up on it, I'll paste what I've worked on here anyway so you can see where the thinking was going. This might take me a while though.
#65359
Craniumonempty wrote:But the absence does exist ...
Isn't this like saying the nonexistence of the money exists?

Yes, it's money owed, but the existence of the money is not something you have. And I don't think we can say that you "have" anything, much less, the nonexistence of the existence of the amount owed.

Yet, I do think you are correct that problems like this can arise even if we don't talk in terms of numbers existing or not existing. (So I'm thinking now that the construction of an (existential) practical example like the bookkeeping example, obfuscates the real question you are raising.)

I was reading Frege's philosophy of mathematics this summer and was dumbfounded at his brilliance of making something out of nothing time and time again through the formulation of new notation.

After reading a couple of math history books about ten years ago, I couldn't help but think the way discovery in math often occurs is exactly the way you are approaching these problems (i.e., intuitively) and is far more likely to be fruitful than looking at foundations and axioms. Once the ideas take shape, the foundations and the proofs can come later. If you were to go to the books now, I suspect that you might look at the brilliance of what has already been done and get discouraged. Innovation in math comes early in life not just because of the peak of thinking, but because many twenty-somethings don't know what older mathematicians "know" cannot be done and aren't misled by what already has been done ... and do it anyway.

Time and time again in history, current notation gets in the way of understanding the concepts.

You might try some sort of mind map of different ways to represent different kinds of numbers. (E.g., see what happens if you were to graph ordinary and imaginary numbers with normal x and y axes and i values on the z axis, or try graphing imaginary numbers in non-Euclidean geometries. What do imaginary numbers look like from the point of view of set theory (i.e., arbitrary closure of the set of real numbers, and the set of imaginary numbers being subsets of the set of complex numbers)? And so forth.) I suspect there are five or six different ways to look at power functions and logarithms -- mind map those, and ideas and approaches might suggest themselves from unexpected correlations and relationships.

I sure wish I was good at R. It would be fun to play with some of these ideas graphically.
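Since R came up, here is a rough equivalent sketch in Python/matplotlib of the "i values on the z axis" idea (the choice of plotting the square roots of -9..9 is mine, purely illustrative):

# Square roots of the integers -9..9, with the imaginary part pushed onto the z axis:
# the curve lies flat in the real plane for n >= 0 and lifts into the z direction for n < 0.
import cmath
import matplotlib.pyplot as plt

ns = list(range(-9, 10))
roots = [cmath.sqrt(n) for n in ns]          # complex square roots

fig = plt.figure()
ax = fig.add_subplot(projection="3d")
ax.plot(ns, [r.real for r in roots], [r.imag for r in roots], marker="o")
ax.set_xlabel("n")
ax.set_ylabel("Re sqrt(n)")
ax.set_zlabel("Im sqrt(n)")
plt.show()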
#65502
dowhat1can wrote:
Craniumonempty wrote:But the absence does exist ...
Isn't this like saying the nonexistence of the money exists?

Yes, it's money owed, but the existence of the money is not something you have. And I don't think we can say that you "have" anything, much less, the nonexistence of the existence of the amount owed.
I agree with you in a way, and will try to address it in the "paper" that I'm writing. It's hard to say how I disagree with this statement, because in a way I do agree with it, but I think there is more to the problem than the physical presence of money.

Where I disagree is this statement: "the existence of the money is not something you have." If this statement were true, then I would completely agree with you. The problem arises in the linking of physically having something and knowing what that something is. If I had no distinct knowledge of what money was, then I couldn't say I had it, nor could I say I didn't have it. The fact is that I do have that distinct knowledge of what money is or how we view it at this time. So I can say when I have it and when I don't. That's not to say that it does or doesn't exist, because it already does exist distinctly in my mind. We probably even agree on the definition as it's something common in our society.

Once the distinction is made, then that is what I have of money, and I can tell whether or not I actually have money. In a way this brings up two forms of "0" for money (again, I'll put this in what I'm writing). There's the initial "0" where I can't tell money from everything else. The distinction doesn't exist. I can't say it's lacking, I can't say it's there. I can't even say it doesn't exist, because I don't know about it. This is the term I called meta-zero. I'm trying to think of another term for it, but it's not really important what it's called. The first thing, or number, or whatever you want to call it, that comes isn't "0" as we know it; rather it's "1": the ability to distinguish it from nothing and/or everything else (the meta-zero). Once I know the concept, then I can say it exists, because I can distinguish it. Here it is, and I have "1" of it. From this concept that was created in my mind, I can go up or down. This has to do with the way I distinguish this something.

The reason how we distinguish becomes important now is that to say there is another "1" of something, I have to know what links it to that other something, because that other something is different from the original "1", or else it would either be that "1" or I wouldn't know about it. So we have to know what "characteristics" or "similarities" link this "1" to that other "1". So we have a "1" and another "1" that have similarities but are also distinct from each other. We can group them in some way: "11" (unary grouping) or "2" (symbolic grouping... or whatever it's called). We can continue this grouping up and up, and even prove that this grouping can go on endlessly (infinite grouping, or what we might call positive infinity). The act of grouping in mathematics is labeled addition (I didn't really have to say that, but just to show).

The process can work the other way too. We have one of that something, then we don't. We have "0" of that something. The something that was distinct is gone, but the distinction still exists in the brain. Therefore, I can say that that distinct something isn't in my possession. Not that it doesn't exist, because the concept now exists, but that there is none that I know of in the way that it is distinct (to me). Again, there is "0" of it. This zero didn't take us back to the meta-zero that we originally had, because it's still there in the mind and exists. So it exists in concept, but there is none of it here.

Now negative is more tricky, because it can itself be defined differently depending on how we want to define what happens when you conceptually remove more of this concept that exists. While I don't physically have any ("0") and I can't physically have less than that, I can conceptually have less, but it must first be defined. So in that, I agree with you that there can be many distinctions of how to treat a conceptual removal of something that conceptually exists.

Basically, we must define -1 before we can move into that realm. For bookkeeping, a conceptual removal is money owed. This is a more complex concept than I let on earlier, but I'll try to make it brief. When we meet, we haven't given or taken money from each other. Granted, I'm working on an already known concept, so saying this is "0" (and not meta-zero) is OK, because we already know the concept.

Now if I give you a dollar, you have "1" of that dollar. You physically and conceptually have that dollar. I physically don't have that dollar, and from the physical amount that I have, I have one less: "-1". That might be a "0" amount that I physically have. The physical amount can't really go less than "1", but it can conceptually go to "0", and it's not defined less than "zero". I won't define it either, as that's another subject entirely. However, the one less that I gave you defines a conceptual negative, "-1", for how much was given. It really is the same as a positive in that it is "1" of what I gave you, but it's an equal and opposite concept for the amount that you have given me. The amount you have given me is "-1". They are both opposite from your end. If you and I are switched (I become you and you become me), the amount that you gave me is "1" and the amount that I have given you is "-1".

So in this concept, the amounts are equal and opposite. They are really both positive amounts viewed as opposite to each other. If you were to give me back the dollar, then the amount you have given me and the amount I have given you both go to "0". If you give me a dollar, then it can be seen as a negative amount I have given you (-1) from my side and a positive amount that you have given me (1).

Basically, they are two conceptually opposite positive lines. They represent the physical as well, but I hope I've already shown why that is different. Negative is now defined for this. With this definition of negative, it should be a little easier to see why a square root of a negative number would equal a negative number.
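A tiny sketch of that two-ledger picture in code, just to pin the bookkeeping down (the names are made up for illustration): a single transfer is recorded as equal and opposite entries on the two sides, and returning the dollar brings both back to "0".

# Two ledgers recording the same transfers from opposite directions:
# what I have given you is always the exact negative of what you have given me.
def transfer(ledgers, giver, receiver, amount):
    ledgers[giver] -= amount      # the giver's running balance goes down
    ledgers[receiver] += amount   # the receiver's goes up by the same amount

ledgers = {"me": 0, "you": 0}
transfer(ledgers, "me", "you", 1)   # I give you a dollar
print(ledgers)                      # {'me': -1, 'you': 1} -- equal and opposite
transfer(ledgers, "you", "me", 1)   # you give it back
print(ledgers)                      # {'me': 0, 'you': 0}  -- back to zero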

Granted, negatives might be defined differently and they can, because beyond the conceptual zero, one must find what is done when it is crossed.

Is that a little better explanation? Cause that's what I'm kind of trying to work on now.
dowhat1can wrote:Yet, I do think you are correct that problems like this can arise even if we don't talk in terms of numbers existing or not existing. (So I'm thinking now that the construction of an (existential) practical example like the bookkeeping example, obfuscates the real question you are raising.)
I'm hoping that I explained it a little better above, and shown somewhat what I'm working on.
dowhat1can wrote:I was reading Frege's philosophy of mathematics this summer and was dumbfounded at his brilliance of making something out of nothing time and time again through the formulation of new notation.

After reading a couple of math history books about ten years ago, I couldn't help but think the way discovery in math often occurs is exactly the way you are approaching these problems (i.e., intuitively) and is far more likely to be fruitful than looking at foundations and axioms. Once the ideas take shape, the foundations and the proofs can come later. If you were to go to the books now, I suspect that you might look at the brilliance of what has already been done and get discouraged. Innovation in math comes early in life not just because of the peak of thinking, but because many twenty-somethings don't know what older mathematicians "know" cannot be done and aren't misled by what already has been done ... and do it anyway.
I guess it helps to have a bad memory then, because everything seems to be new at times, even though I might have gone over it again and again. I'm not an older mathematician, but I am older (depending on who you talk to). Old enough to have been older than one of my professors in college that had a doctorate (in computer science). Granted, he was one of the youngest people with a doctorate that I'd ever met, but that's not the point.
dowhat1can wrote:Time and time again in history, current notation gets in the way of understanding the concepts.

You might try some sort of mind map of different ways to represent different kinds of numbers. (E.g., see what happens if you were to graph ordinary and imaginary numbers with normal x and y axes and i values on the z axis, or try graphing imaginary numbers in non-Euclidean geometries. What do imaginary numbers look like from the point of view of set theory (i.e., arbitrary closure of the set of real numbers, and the set of imaginary numbers being subsets of the set of complex numbers)? And so forth.) I suspect there are five or six different ways to look at power functions and logarithms -- mind map those, and ideas and approaches might suggest themselves from unexpected correlations and relationships.

I sure wish I was good at R. It would be fun to play with some of these ideas graphically.
As far as Euclidean goes, yeah, I might be approaching this in a Euclidean way, but I'm going to continue either way. I don't think I'll change anything, but I'd like to get the idea solidified nonetheless. I think an awful lot. Well, probably as much as everyone else, but I don't really take much time to record anything. I've been trying to lately, but nailing it down to one concept is difficult when you aren't used to writing it down.
#65584
The notion that the idea of something implies existence has been rejected by most philosophers from Kant to Kierkegaard, and this rejection gives rise to the Boolean interpretation of classical logic. Without it, logic is inconsistent. The key idea is that "existence is not a predicate." I'll not try to summarize the arguments here, because there are well-edited primary sources (a couple of pages each) at one of my favorite philosophy sites which give the reasoning: Kierkegaard, "God's Existence Cannot Be Proved," and Kant, "Existence Is Not a Predicate."
LA, (the editor of those readings) wrote:How do we talk or think about things without supposing, in some sense at least, that they exist? Bertrand Russell expressed one aspect of the problem this way: If it's false that the present King of France is bald, then why doesn't this fact imply that it's true the present King of France is not bald? When the existence of the subjects of our statements are in question, the normal use of logic becomes unreliable. Kant argues that the use of words (or "predicates") alone does not necessarily imply the existence of their referents. We can only assume the existence of entities named by our words; we cannot prove "existence" by means of the use of language alone.
So I'll just simply disagree that it's consistent to say you have something when you don't -- and leave it at that.

On another note, I did find a mathematician who, in the beginning apparently was urging, from what I can tell, the same ideas as you do in the comments above, and by all accounts he was a brilliant mathematician. This guy, Leopold Kronecker, also "got hung up" on the problem of existential import and pestered Weierstrass, Dedekind and Cantor throughout their lives. According to Berlinski, A Tour of the Calculus, by being consistent, Kronecker eventually ended up only believing natural numbers existed -- all other work of negative numbers, negative fractions, imaginary numbers, ... are the confused work of man.

When Kronecker heard Cantor had retired to a mental institution, he replied, "da gehört er hin" ("that's where he belongs").
More later -- I'm off to work.
