Q: @RSLT It’s very impressive that you made three different proofs of one theorem. I would like to discuss the limit first, and then the differences between a limit and analytic continuation.
First, taking a variable to infinity is very dangerous. Many math tools won’t work properly in this situation, as it is generally not the case they were designed for. For instance, L'Hôpital's rule may give a false result. Consider x/x² = x/x², let x go to ∞, and apply L'Hôpital's rule on the RHS; this gives x/x² = 1/(2x). Multiply both sides by x and we get 1 = 1/2, a clear contradiction. This example shows that anything proved with L'Hôpital's rule may not hold in the infinity case. What I am trying to argue is that “let b go to ∞ in this equation” may lead you to a false proof, as a lot of theorems malfunction and you don’t know which one goes wrong.
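To make the point concrete, here is a minimal numeric sketch (my addition, not part of the original comment): L'Hôpital's rule only equates the limits of x/x² and 1/(2x); reading it as an identity between functions and multiplying by x is what produces the 1 = 1/2 contradiction.

```python
# Minimal sketch of the contradiction: both sides of the "identity" tend to 0,
# but multiplying by x exposes that they are different functions.
def times_x_lhs(x):
    return x * (x / x ** 2)   # x * (x/x^2) = 1 for every x > 0

def times_x_rhs(x):
    return x * (1 / (2 * x))  # x * (1/(2x)) = 1/2 for every x > 0

for x in [10.0, 1e6, 1e12]:
    assert abs(times_x_lhs(x) - 1.0) < 1e-9
    assert abs(times_x_rhs(x) - 0.5) < 1e-9
# The limits lim x/x^2 = lim 1/(2x) = 0 do agree; only the pointwise
# "equality" of the two expressions is false.
```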
Furthermore, the classical limit is sometimes not analytic. Right at the first step of equation (13), you used Σₖ (-1)ᵏ f(k) = Σₖ[ 2f(2k) - f(k) ] while trying to convert the alternating sum into the difference of two sums. This is not guaranteed if Σₖ f(k) diverges. In fact, the complete formula is the following: Σₖ (-1)ᵏ f(k) = Σₖ[ 2f(2k) - f(k) ] - lim{n→∞} Σₖ{n→2n-1} f(k), where the limit converges to zero if Σₖ f(k) converges. To show how it works, consider Σₖ{≥0} (-1)ᵏ/(k+1) = Σₖ{≥0}[ 2/(2k+1) - 1/(k+1) ] = ψ(1) - ψ(1/2), which is not true. The mistake is that Σₖ{≥0} 1/(k+1) diverges, so you need to compute the limit. Your mistake is analogous to the example above, and the limit that you didn’t calculate may not be zero. You were trying to prove Σₙ (-1)ⁿ⁻¹/nˢ = Σₙ[ 1/nˢ - 2/(2n)ˢ ], which is not true for s=1 (LHS = ln 2, RHS = 0). At the edge of convergence, the summation may suddenly fail to be analytic.
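The corrected identity can be checked numerically; the sketch below (my addition) takes f(k) = 1/(k+1) and writes the tail as the finite sum over k = n to 2n-1:

```python
import math

# Check: sum_k (-1)^k f(k) = sum_k [2 f(2k) - f(k)] - lim_n sum_{k=n}^{2n-1} f(k)
# with f(k) = 1/(k+1).  The naive two-sum split gives 2 ln 2, the tail
# supplies the missing ln 2, and the alternating sum itself is ln 2.
f = lambda k: 1.0 / (k + 1)
N = 200_000

alt = sum((-1) ** k * f(k) for k in range(2 * N))      # -> ln 2
split = sum(2 * f(2 * k) - f(k) for k in range(N))     # -> 2 ln 2
tail = sum(f(k) for k in range(N, 2 * N))              # -> ln 2

assert abs(alt - math.log(2)) < 1e-3
assert abs(split - 2 * math.log(2)) < 1e-3
assert abs(alt - (split - tail)) < 1e-9   # the identity is exact at every N
```

Dropping the tail, as in the naive split, silently doubles the answer here, which is exactly the commenter's warning.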
You can’t take the analytic continuation all the time. Analytic continuation ensures that you can’t have two different continuations that give different results; however, in the video you were taking the classical limit of the equation. Even if you proved the formula is true for s>1, you can’t just say “let me take the analytic continuation.” Otherwise, can you explain why your formula in equation (13) (equating the LHS between steps 1 and 2, canceling out α⁻¹) is true for everything except s=1? Perhaps you can prove that based on the restriction ζ(s)=ζ(1-s*). Your other proofs all prove things about the analytic continuation, not the original sum.
In conclusion, you let b go to infinity, viewed the limit as the analytic continuation (which it is not), then applied the formula for the analytic continuation, and thus got a false proof.
If you are confident that you have a correct proof that avoids the problem I just mentioned, it would be great to discuss it in the comments. I really enjoyed the argument with you too!
A: I agree, and indeed I’m one of the advocates who believe we need to look seriously into the definition of limits. You brought up the example of L'Hôpital's Rule, and I believe the situation is even worse than you described, because the rule can also fail to give any answer at all even when the limit exists. As a real example, it gives no verdict on
\[ \lim_{x \to \infty} \frac{x}{\sin(x) + x} \]
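A quick numeric sketch (my addition) of why this limit is bad news for L'Hôpital: the limit itself is 1, yet the quotient of derivatives, 1/(cos x + 1), oscillates forever, so the rule reaches no conclusion.

```python
import math

# The original ratio settles at 1 (divide numerator and denominator by x)...
for x in [1e3, 1e6, 1e9]:
    assert abs(x / (math.sin(x) + x) - 1.0) < 1e-2

# ...but the L'Hopital quotient 1/(cos x + 1) keeps oscillating.
samples = [1 / (math.cos(x) + 1) for x in (10.0, 11.0, 12.0, 13.0)]
assert max(samples) - min(samples) > 1.0  # no limit to speak of
```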
I'm aware of the issues with the limit definition; they are well understood. It is a flawed tool, but still useful if used carefully enough. It is like a car that works in first gear but starts failing in higher gears: it will still get us somewhere.
Before we get to the answer, here are a few notes:
I like to use this analogy: my goal is to issue a valid license for RH. I know that some people are mistakenly called 'Indian,' which has nothing to do with India. My job here is to issue a driver's license for a Native American, and it really has nothing to do with their race. What I mean by this is that the things we name—infinity, analytic continuation, divergence, range functions, and the limit definition—have major flaws, and we could literally mean something else when we use them. However, I have proven that those flaws cannot undermine my proof, using the cautionary actions below.
One, most people call anything boundless “infinity” without caring about its kind or size. But there are different kinds and sizes of infinity—real infinity, complex infinity, and my addition of super infinities. Instead of arguing about what kind of infinity something is, I keep everything under control and finite. In other words, in RSLT we prove functions for arbitrarily large numbers: I set up the functions so that the statement is true for b=1, then b=2, and so on, thus incorporating the kind and size of the infinities into the destination equation. It is as if we prove something for one apple, then for two apples, and thus for arbitrarily many apples. Some of the errors you listed come from mixing up apples and oranges. I didn't explain which is which, because it is cumbersome and there is no need to make a topic that is already complicated even more complex.
Two, the method I have used in RSLT, and which I believe I have pioneered, deals with infinities using induction and the correct use of the algebraic limit theorem. I do not let functions become infinite; instead, I keep them finite. Once I am done with the equation and have what I need, I take the limit, showing a continuous path to what I have stated. I prove the statement is true when the variable is one, then for the same function when the variable is two, and so on, up to infinity. This allows us to perform algebra on divergent functions. I know we cannot do algebra on divergent equations in general, but you can do it on all RSLT equations.
Three, I have at least two independent proofs for everything; RSLT always has two proofs. It started with two zeta functions—see equation 3 and refer to the structure on the page.
Four, when something is unclear or vague, I go back to basics until it becomes irrefutable, using, if needed, only the four operations (+, –, ×, /) and one-to-one correspondence. See here for more details and for the method of obtaining the analytic continuation of the zeta function to the critical line using +, –, ×, and / only.
Having stated the rules/notes above (you don't need to memorize them), I will counter your arguments below.
Q: For instance, L'Hôpital's rule may give a false result. Consider x/x² = x/x², let x go to ∞, and apply L'Hôpital's rule on the RHS; this gives x/x² = 1/(2x). Multiply both sides by x and we get 1 = 1/2, a clear contradiction. This example shows that anything proved with L'Hôpital's rule may not hold in the infinity case. What I am trying to argue is that “let b go to ∞ in this equation” may lead you to a false proof, as a lot of theorems malfunction and you don’t know which one goes wrong.
A: First, it is true that we have two divergent functions, Σₙ{1→b} 1/nˢ and b^(1-s)/(1-s), in the critical strip. However, we have the transcendental zeta function ζ(s) = lim{b→∞} [ Σₙ{1→b} 1/nˢ - b^(1-s)/(1-s) ], and TZF converges absolutely on the critical strip. (As usual, there are two proofs, and a 10K plus 10K bounty on it.) TZF shows that these arbitrarily large values behave well, and there is no ambiguity about their combined behavior in the critical strip. How do we know? Because we have continuous paths to every single point: the statement is true for b=1, then b=2, and thus for any arbitrarily large number, hence TZF.
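For what it's worth, the combination Σₙ{1→b} 1/nˢ - b^(1-s)/(1-s) does converge for 0 < Re(s) < 1; this is a standard representation of ζ(s). A small check at s = 1/2 (my sketch, pure Python), against the known value ζ(1/2) = -1.4603545088...:

```python
# zeta(s) ~ sum_{n=1}^{b} n^(-s) - b^(1-s)/(1-s) for 0 < Re(s) < 1,
# checked here at s = 1/2.
def zeta_partial(s, b):
    return sum(n ** (-s) for n in range(1, b + 1)) - b ** (1 - s) / (1 - s)

approx = zeta_partial(0.5, 1_000_000)
assert abs(approx - (-1.4603545088)) < 1e-3
# Doubling b barely moves the value, i.e. the combination settles down:
assert abs(zeta_partial(0.5, 2_000_000) - approx) < 1e-3
```

Each of the two pieces alone diverges as b grows; only the difference stabilizes.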
Q: Furthermore, the classical limit is sometimes not analytic. Right at the first step of equation (13), you used Σₖ (-1)ᵏ f(k) = Σₖ[ 2f(2k) - f(k) ] while trying to convert the alternating sum into the difference of two sums. This is not guaranteed if Σₖ f(k) diverges. In fact, the complete formula is the following: Σₖ (-1)ᵏ f(k) = Σₖ[ 2f(2k) - f(k) ] - lim{n→∞} Σₖ{n→2n-1} f(k), where the limit converges to zero if Σₖ f(k) converges. To show how it works, consider Σₖ{≥0} (-1)ᵏ/(k+1) = Σₖ{≥0}[ 2/(2k+1) - 1/(k+1) ] = ψ(1) - ψ(1/2), which is not true. The mistake is that Σₖ{≥0} 1/(k+1) diverges, so you need to compute the limit. Your mistake is analogous to the example above, and the limit that you didn’t calculate may not be zero. You were trying to prove Σₙ (-1)ⁿ⁻¹/nˢ = Σₙ[ 1/nˢ - 2/(2n)ˢ ], which is not true for s=1 (LHS = ln 2, RHS = 0). At the edge of convergence, the summation may suddenly fail to be analytic. You can’t take the analytic continuation all the time.
A: Before we move on to 13, there is some text regarding one-to-one correspondence, or bijection. In equation 12, you can see that I addressed the problem you mentioned earlier (here is the page for it, with a video). Now, moving on to 13: remember I said I would go back to basics. This is one of the times when analytic continuation is ambiguous, so we return to the fundamentals.
Consider the sum 1/1^0.5 + 1/2^0.5 + ... = infinity, which is paired with ζ(1/2) = -1.4603545088.... What I have demonstrated in equation 12 is that certain arbitrarily large numbers (or, to be more precise, 'super infinities') are equivalent to a specific finite number (a one-to-one correspondence).
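That finite value can be cross-checked independently of equation 12 (my sketch): the alternating series η(s) converges for s > 0, and ζ(s) = η(s)/(1 - 2^(1-s)), which at s = 1/2 recovers -1.4603545088...

```python
# eta(1/2) = sum (-1)^(n-1) / sqrt(n) converges; averaging two consecutive
# partial sums damps the alternating oscillation for a sharper estimate.
N = 100_000
s_n = sum((-1) ** (n - 1) * n ** -0.5 for n in range(1, N + 1))
s_n1 = s_n + (-1) ** N * (N + 1) ** -0.5
eta_half = (s_n + s_n1) / 2

zeta_half = eta_half / (1 - 2 ** 0.5)   # zeta(s) = eta(s) / (1 - 2^(1-s))
assert abs(zeta_half - (-1.4603545088)) < 1e-5
```

Two unrelated routes (this one and the b^(1-s)/(1-s) correction) land on the same number, which is the uniqueness the discussion keeps returning to.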
Is equation 12 a new way of doing analytic continuation? Maybe. Should we call it a super-super operation? Maybe. My goal is to prove the Riemann Hypothesis, and I don't want to get into debates about naming conventions or the difference between 'Indian' and 'Native American.' It's just a name. The important point is that section 12 shows how specific infinities are mapped to specific values without making the errors you mentioned.
Now, I said all this to explain that in section 13 I used simplified steps from section 12. I could show all the steps from section 12, but it would be exhausting and wouldn't fit on the page. However, if you'd like, I can go over them. The point is that 12 is correct and 13 is a simplified version of it. The point of 13 is to show that Σₙ 1/nˢ = Σₙ 1/n^(1-s*) when ζ(s) = ζ(1-s*).
The short answer is: (1/nˢ)·α = (1/nˢ)(1 - 2/2ˢ) = 1/nˢ - 2/(2ˢnˢ) = 1/nˢ - 2/(2n)ˢ, so the step is true because of the computation I have just stated. What I was referring to, while having a bit of fun and hiding an Easter egg in the RSLT proof, is that Ramanujan's method, although not rigorous, is still correct in this case. He was able to manipulate infinite series in a way that was later proven to be valid, as shown in the proof I provided in section 12.
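The algebra in that line does check out termwise; here is a numeric sketch (my addition; the sample point s = 0.3 + 4i is an arbitrary choice), plus a sanity check of the same factor α = 1 - 2^(1-s) at s = 2, where both series converge classically:

```python
import math

# Termwise identity: (1/n^s) * (1 - 2/2^s) == 1/n^s - 2/(2n)^s.
s = 0.3 + 4j                 # arbitrary sample point inside the critical strip
alpha = 1 - 2 / 2 ** s       # equals 1 - 2^(1-s)
for n in range(1, 50):
    assert abs(n ** -s * alpha - (n ** -s - 2 * (2 * n) ** -s)) < 1e-12

# Where everything converges (s = 2), the factor really relates the sums:
# eta(2) = (1 - 2^(1-2)) * zeta(2), i.e. pi^2/12 = (1/2) * (pi^2/6).
eta2 = sum((-1) ** (n - 1) / n ** 2 for n in range(1, 100_000))
assert abs(eta2 - 0.5 * math.pi ** 2 / 6) < 1e-6
```

The commenter's objection is not about this termwise algebra but about summing both sides when the series diverge, which the termwise check alone does not settle.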
Q: Analytic continuation ensures that you can’t have two different continuations that give different results; however, in the video you were taking the classical limit of the equation. Even if you proved the formula is true for s>1, you can’t just say “let me take the analytic continuation.” Otherwise, can you explain why your formula in equation (13) (equating the LHS between steps 1 and 2, canceling out α⁻¹) is true for everything except s=1? Perhaps you can prove that based on the restriction ζ(s)=ζ(1-s*). Your other proofs all prove things about the analytic continuation, not the original sum.
A: It is agreed that analytic continuation cannot produce different results, and this has not occurred. However, as mentioned, section 17 specifically uses the uniqueness of analytic continuation to prove SSE. For example, analytic continuation ensures that 1/1^0.5 + 1/2^0.5 + ... = infinity is paired with ζ(1/2) = -1.4603545088..., and no other number. The method used to develop the analytic continuation is consistent with all other analytic continuations of the zeta function. The alternative zeta function is equal to abc and the transcendental zeta function; there is no difference in value.
We only have one way to generate the prime counting function: Euler's product formula, which is equal to the Riemann zeta function, and both functions diverge. The zeros of the zeta function, where these two functions diverge, lead to the famous Riemann–von Mangoldt explicit formula and other similar results.
In conclusion: no, there is no evidence that the RSLT proofs could be incorrect (at least, we haven't seen any yet). The reason for my confidence is redundancy: even if one approach is disproven, there is still a path to prove and establish the result (please see the structure page). Moreover, if a disproof claim cannot be supported with a simple counterexample, it strongly indicates that a fundamental part is missing. We know that the absence of a numeric counterexample is not a proof, but as views of the RSLT videos increase, statistically it will be harder to disprove RSLT. Lastly, I used well-proven facts and functions, and, at least for one of the proofs, only +, –, ×, /, one-to-one correspondence, and induction, to avoid the errors you mentioned. If these simple operations fail, we are in trouble in math.
Side note, not directly related: the fact that the series 1² + 2² + ... approaches infinity and is also said to equal zero (ζ(-2) = 0) is not a contradiction. It simply shows that we don't fully understand infinity and how it works. This is crucial, because if we could prove it to be a contradiction, the entire premise of prime numbers and their relation to the zeros of the zeta function would be invalid. In other words, the foundation of prime number theory based on the zeta function would be flawed, because, as noted above, Euler's product formula is our only way to generate the prime counting function, and it equals the Riemann zeta function.
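For the record, the value zero here is not arbitrary: the functional equation ζ(s) = 2ˢ π^(s-1) sin(πs/2) Γ(1-s) ζ(1-s) forces ζ(-2) = 0, simply because sin(-π) = 0. A numeric sketch of that evaluation (my addition):

```python
import math

# Right-hand side of the functional equation at s = -2: every factor is
# finite and sin(pi * s / 2) = sin(-pi) = 0, so the product vanishes.
s = -2
zeta_3 = sum(n ** -3.0 for n in range(1, 100_000))  # zeta(1 - s) = zeta(3)
val = (2 ** s * math.pi ** (s - 1) * math.sin(math.pi * s / 2)
       * math.gamma(1 - s) * zeta_3)
assert abs(val) < 1e-9   # zeta(-2) = 0, a "trivial zero"
```

The same sine factor kills ζ at every negative even integer, which is why those are called the trivial zeros.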