
# statistics on PlanetMath

This entry is supposed to become a textbook on statistics – or at least a guide to the statistics-related entries on PlanetMath. It will refer to related entries (e.g. from probability) that are not listed in the same MSC.

(This is currently a stub article. See also Textbook projects on PlanetMath.)

Type of Math Object: Topic

Major Section: Reference

## Mathematics Subject Classification

62-01


## Comments

## too popular?

This is version 4 of statistics on PlanetMath, born on 2004-11-13, modified 2006-11-27.

Object id is 6474, canonical name is StatisticsOnPlanetMath.

Accessed 1316 times total.

Why so many times accessed and still no major revisions?

## Re: too popular?

We could FDL-copy from Wikipedia or Rusin's page to get things moving!

apk

## Re: too popular?

> Why so many times accessed and still no major revisions?

It might help if the author of this entry would provide some seed content in the form of a list of PM entries on the subject of statistics. Given his experience editing FEM, this should not be so hard. My personal experience has been that, when I have started an entry with a list of such entries, other people would then add items, organize content, and add explanatory text. However, given the absence of initial content, what happened here is pretty much what I expect.

## Re: too popular?

That makes sense. However, one might hope (I guess maybe I did) that posting a pure stub might provoke others to generate the actual content. Maybe this would work better if there were a "bare stub" category of entry, something that I think has been suggested before.

## Re: too popular?

> "bare stub" category of entry

Isn't that basically what the entry requests list does?

## Re: too popular?

> > "bare stub" category of entry
>
> Isn't that basically what the entry requests lists does?

Yes, but it shouldn't be the only part of PM that has this function.

## Re: too popular?

I would discourage people from creating "bare stub" entries unless they plan on writing and contributing a substantial amount to the entry. Otherwise, people will also use this mechanism to get points for doing no work at all.

I think the requests list already provides a place where people can point out holes in the encyclopedia.

A

## Re: too popular?

> people will also use this mechanism to get points for doing no work at all.

The point-metric could be set up so that a "stub" was worth far less than a full entry.

## Game

You are playing a game in which you can ask for numbers. Each such number is a random value uniformly (and independently) distributed between 0 and 1. After you receive each number you can decide whether to request an additional number or to stop. When you stop your score is the sum of all the numbers you have received. Let 0<x<1. We ask two questions.

1. Suppose you are trying to achieve a score between x and 1. What are your chances of success?

2. Suppose you are trying to achieve a score between n+x and n+1. What are your chances of success in the limit as n --> infinity?
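Since a running sum below x can never be a winning stopping point, the natural strategy for question 1 is to draw while the sum is at most x and stop immediately after. A quick Monte Carlo sketch (my own illustration, not from the thread; the function name and the closed form mentioned in the comment are my additions):

```python
import math
import random

def success_prob(x, trials=200_000, seed=42):
    """Estimate the chance of stopping with a sum in (x, 1).

    Strategy: keep drawing while the sum is <= x, stop as soon as
    it exceeds x; we win if it has not overshot past 1.
    """
    rng = random.Random(seed)
    wins = 0
    for _ in range(trials):
        s = 0.0
        while s <= x:
            s += rng.random()
        if s < 1.0:
            wins += 1
    return wins / trials

# The estimates agree with the closed form (1 - x) * e**x that a
# renewal-style argument suggests, e.g. for x = 0.5:
print(success_prob(0.5), 0.5 * math.exp(0.5))
```

For question 2 one can shift the target to (n+x, n+1) in the same loop and watch the success probability settle toward a limit as n grows.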

## Re: Game

I need some clarification:

> After you receive each number you can decide whether to request an additional number or to stop.

Do you know what the number is once you receive it? Or is it kept secret? I am guessing that you know the number when you receive it, mainly because that would make the problem much easier!

## Re: Game

I assume (I may be wrong) that you know the number when you receive it. Assuming that we are being told the number, one can approach the solution.

## Re: Game

As I do not have TeX on my computer and the calculations are kind of messy, I put my thoughts in a collaboration (to be deleted once you are satisfied unless someone absolutely protests):

http://planetmath.org/?op=getobj&from=collab&id=108

## Re: Game

Thanks a lot Wkbj79,

It is indeed a fantastic piece of work by you. You can go ahead and remove that link. But can you please also answer the second question i.e.

2. Suppose you are trying to achieve a score between n+x and n+1. What are your chances of success in the limit as n --> infinity?

## Re: Game

> But can you please also answer the second question...

I hope to look at it at some point. It is bedtime though. (Right now, it is 12:50am here.) Unless I suffer from insomnia due to thinking about that problem too much, I will not get to looking at that problem until I get up in the morning/afternoon. I will put all of my thoughts in the "sandbox" so that I will not have to make a new one for the second problem.

Good night.

-Warren

## Re: Game

Good night, Warren

Sweet dreams!!

## Wkbj79's sandbox

> ....I put my thoughts in a collaboration (to be deleted once you are satisfied unless someone absolutely protests):

http://planetmath.org/?op=getobj&from=collab&id=108

Now I am curious about something: can anyone else besides me access this from the collaborations menu (on the left-hand side of the screen)? If not, then I am guessing that the description of my sandbox is not viewable either. I would hate for one of my most creative moments to go to waste....

## Re: Game

Warren's solution for Question 1 is in his sandbox: http://planetmath.org/?op=getobj&from=collab&id=108

Can anybody please answer Question 2, using the ideas from his solution to Question 1 as ingredients?

## Re: Game

Can you please answer question 2? Warren has already worked out the first one, but something is going wrong with the second:

http://planetmath.org/?op=getobj&from=collab&id=108

## Re: Game

If anyone can please direct me to the correct answer for the second question, it will be highly appreciated.


## Re: Game

Warren's solution for Question 1 is in his sandbox: http://planetmath.org/?op=getobj&from=collab&id=108

Can anybody please answer Question 2, using the ideas from his solution to Question 1 as ingredients? Any help will be sincerely appreciated. I know you are all very busy, and you have always been kind in helping people. Can you please spare some time for this?

## ponder september 2007

A cable TV company has two channels of programming that it can offer (at negligible marginal cost) to a large pool of potential customers. Suppose the value (per unit time) of each channel to each potential customer is an independent random variable uniformly distributed on the interval (0,1) and that value is additive (so the value of both channels to a potential customer is the sum of the values of the individual channels). Assume a potential customer will buy when the (additional) value is greater than the (additional) price.

1. Suppose the company prices each channel independently. What should it charge for each channel (to maximize revenue) and what will its expected revenue be (per potential customer per unit time)?

2. Suppose the company bundles the channels together so the potential customer has to buy both or neither. What should it charge (to maximize revenue) for the bundle and what will its expected revenue be (per potential customer per unit time)?

Define the consumer surplus to be the difference (assuming it is positive) between what a potential buyer would have been willing to pay for an item and the actual price. If the difference is negative the sale will not take place and the consumer surplus is defined to be 0.

3. What is the consumer surplus (per potential customer per unit time) assuming the cable company prices to maximize revenue for cases 1 and 2 above?
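Under my reading of the setup, a single channel priced at p sells with probability 1-p, while a bundle priced at b <= 1 sells with probability 1 - b^2/2 (the area above the line u1+u2 = b). A grid-search sketch comparing the two cases — the demand formulas are my own derivation, so treat them as assumptions:

```python
def separate_revenue(p):
    # two channels priced independently at p: each sells iff its value > p
    return 2 * p * (1 - p)

def bundle_revenue(b):
    # P(U1 + U2 > b) is 1 - b^2/2 for b in [0, 1] and (2 - b)^2 / 2 for b in [1, 2]
    sell = 1 - b * b / 2 if b <= 1 else (2 - b) ** 2 / 2
    return b * sell

grid = [i / 10000 for i in range(20001)]  # prices from 0 to 2
best_sep = max((separate_revenue(p), p) for p in grid if p <= 1)
best_bun = max((bundle_revenue(b), b) for b in grid)
print(best_sep, best_bun)  # bundling should come out ahead
```

The search puts the per-channel price at 1/2 and the bundle price near sqrt(2/3), with the bundle yielding the higher expected revenue.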

## Re: ponder september 2007

Hi Parashar, this time you indicate clearly the source of the problem you are proposing. This is a typical engineering problem, which suits the IBM site very well. I doubt that many people here will be interested. As for me, I got lost before reaching the second question (and I am an engineer...).

That said, I must thank you very much for your previous problem concerning an infinite product. It kept me busy for more than two weeks. I can hardly remember having enjoyed overcoming such a challenge. In fact, I solved it twice by two different methods.

I started the hard way, taking the log to convert the product into a sum (most people are more comfortable with infinite sums). Then I expanded the log in a Taylor series, getting a double sum. After multiplying by the powers of an auxiliary variable and differentiating, one of the sums turned out to be a simple geometric series, so I was left with an infinite sum and an integral. I managed to compute the sum by contour integration in the complex plane! (a method described by Morse and Feshbach in "Methods of Theoretical Physics"). I struggled a while with a weird integral until I succeeded in reducing it to:

\int_0^{\pi/2} x\cot{x}\,dx.

I finally managed to compute this integral (ln(2)pi/2) in a very unusual way. At this point Perucho posted his approximate evaluation and I was able to answer that he was not even close. Somebody else also posted a wrong answer based on Mathematica.

The next day, after the euphoria of the triumph faded, I got the uncomfortable feeling that I had missed something: there must be a much simpler and more direct way to handle this product. So I tried to evaluate the product directly, up to N factors instead of infinity. I succeeded indeed in getting a closed formula, full of factorials and powers, but I could not see a simple way to find the limit. Mostly, I was puzzled by the fact that I knew there was a pi factor in the final answer; I could not see how this factor could come out of a ratio of integers.

Suddenly, eureka: Stirling! This is an approximate formula which becomes exact in the limit as n goes to infinity, and I was just on my way to infinity! I found there the pi factor, as expected.

This problem is so fascinating that I think it deserves to be recalled, maybe with a hint: compute first the product up to N terms.

## Re: ponder september 2007

@dh2718

Can you please send me the complete answer with all the steps of the previous problem, so that I can try to find some alternate way out of it? Please mention all the steps up to the final answer.

## Re: ponder september 2007 (infinite product)

Hi dh2718,

> At this time Perucho posted his approximate evaluation and I was able to answer that he was not even close.

Certainly! I'll tell you what happened to me. When I was computing that infinite product I recalled a case which is related to the gamma function on the z-plane, but in that case I was investigating convergence and not the limit value itself. That was my lapsus. D'Ambrosio had already pointed out my mistake. I apologize for my late reply, as I have been working on complementing mathcam's entry about ``matrix exponential'' and also on my new entry (Airy functions). So I have stopped my work for a moment in order to reply to you. Well my friend, I have reviewed my calculations and I have realized that you're right in that the little hard part is about extracting the \pi-term. I did what follows (you may take pencil & paper).

P=\Pi_{n=2}^\infty (1/e).[n^2/(n^2-1)]^{n^2-1}.

Let's take the n-partial product (it is easier to work with the inverse)

P_n=1/{e^n.\Pi_{k=2}^{n+1}[(k^2-1)/k^2]^{k^2-1}}.

It's clear that $P=\lim_{n\to\infty}P_n$. By taking logarithms,

\log{P_n}=-n-\sum_{k=2}^{n+1}(k^2-1)\log[(k^2-1)/k^2]=

-n-\sum_{k=2}^{n+1}(k^2-1)[\log{k+1}+\log{k-1}-2\log{k}]=

-n-\sum_{k=3}^{n+2}[(k-1)^2-1]\log{k}-\sum_{k=1}^n[(k+1)^2-1]\log{k}+

\sum_{k=2}^{n+1}(k^2-1)(2\log{k})=

-n-\sum_{k=3}^n(k^2-2k)\log{k}-(n^2+2n)\log{n+2}-(n^2-1)\log{n+1}-

\sum_{k=3}^n(k^2+2k)\log{k}-3\log{1}-8\log{2}+

\sum_{k=3}^n(k^2-1)(2\log{k})+6\log{2}+(n^2+2n)(2\log{n+1}).

By simplifying and putting $-2\log{2}$ at the resultant sum, we have

\log{P_n}=-n-\sum_{k=2}^n(2\log{k})+(n^2+2n)\log{(n+1)/(n+2)}+(2n+1)\log{n+1}.

Rearranging and splitting these terms is not so easy either. We will work out $-n-\sum_{k=2}^n(2\log{k})$ first (recall that $\log{e}=1$ and $\log{1}=0$, terms that we will introduce). So,

-n-\sum_{k=2}^n(2\log{k})=-\sum_{k=1}^n(\log{k^2}+\log{e})=

=-\sum_{k=1}^n\log{ek^2}=-\log{\Pi_{k=1}^n(ek^2)}=-\log{e^n.(n!)^2}.

So we now call Mr. Stirling: n!~\sqrt{2\pi}.n^{n+1/2}.e^{-n}.

We have no problem here, as there is no sum (my lapsus!). Hence,

-n-\sum_{k=2}^n(2\log{k})=-\log{e^n.(2\pi.n^{2n+1}.e^{-2n})}=

-\log{2\pi.e^{-n}.n^{2n+1}}=-\log{2\pi}+n-(2n+1)\log{n},

which is substituted into the above equation to give us

\log{P_n}=-\log{2\pi}+n-(2n+1)\log{n}+(n^2+2n)\log{(n+1)/(n+2)}+(2n+1)\log{n+1}.

We must now take care in splitting this thing. On this way:

\log{P_n}=-\log{2\pi}+[n+(n^2-1)\log{(n+1)/(n+2)}]+

{(2n+1)[\log{(n+1)/(n+2)}+\log{(n+1)/n}]}.

We are now prepared to take each limit. The former (l'Hospital throughout),

[n/(n^2-1)+\log{(n+1)/(n+2)}]/[1/(n^2-1)]=...=

(-3n^3-5n^2-3n-1)/[-(2n^3+6n^2+4n)]

which tends to $3/2$ as $n$ goes to $\infty$. The latter one,

[\log{(n+1)/(n+2)}+\log{(n+1)/n}]/[1/(2n+1)]=..=

-1/2[(4n^2+4n+1)/(n^2+3n+2)-(4n^2+4n+1)/(n^2+n)],

which tends to $0$ as $n$ goes to $\infty$. Thus,

\log{P}=-\log{2\pi}+3/2,

or

P=e^{3/2}/(2\pi).

BTW, I have not been able to find the answer. Where did you obtain it?

Cheers,

perucho
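perucho's closing value, P = e^{3/2}/(2\pi), is easy to sanity-check numerically by summing the logarithms of the factors; a short sketch (mine, not part of the thread):

```python
import math

def log_partial_product(N):
    # log of prod_{n=2}^{N} (1/e) * (n^2 / (n^2 - 1))^(n^2 - 1)
    total = 0.0
    for n in range(2, N + 1):
        total += -1.0 + (n * n - 1) * math.log(n * n / (n * n - 1.0))
    return total

target = 1.5 - math.log(2 * math.pi)  # log of e^{3/2} / (2*pi)
print(log_partial_product(20000), target)
```

The terms behave like -1/(2n^2), so the partial sum at N = 20000 should sit within roughly 1/(2N) of the limit, and the two printed numbers do agree to several decimals.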

## Re: ponder september 2007 (infinite product)

Hi Daniel and perucho,

Thanks for your answers on the infinite product post. I think now we should move on to the Ponder September 2007 problem; what do you think?

## Re: ponder september 2007

Any comments??

## Re: ponder september 2007

I told you that not many people would be interested (I am not)...

## December Ponder

Suppose we have a large number, n, of parts and know at least one is defective. If we can test any number of parts at once (learning whether they are all good or at least one is defective) then it is well known that a binary search will identify a defective part after log2(n) tests.

This month's puzzle concerns a variation on this problem in which the defective part only fails intermittently. Assume when the bad part is tested (by itself or with other parts) it fails half the time and passes half the time. The following questions concern how many tests on average it takes various algorithms to identify a single defective part out of a large number n.


You should ignore any lower-order terms arising from the fact that you can only test integral numbers of parts at once. Express the answers as C*log2(n) where C is a constant given to 4 decimal places (i.e., 1.0000 for the original problem).

1) Suppose we use the following algorithm. Select a subgroup of size .5*n at random and test it. If it passes repeat with different random subgroups of size .5*n until you find a subgroup which fails. Then recursively apply the algorithm to smaller groups until you have isolated the bad part. How many tests on average will this take?

2) Use the algorithm in 1) except you test random subgroups of size a*n instead of .5*n. What is the optimal value of a and how many tests on average will be required?

3) Divide the parts at random into 2 equal subgroups and test them alternately until one fails. Then apply the algorithm recursively. How many tests on average will be required to isolate the bad part?

4) Same as 3) except at each stage the parts are divided into 3 equal subgroups which are tested in turn until one fails. Again how many tests on average will be needed to find the bad part?

5) (Not required for credit) The algorithm in 4 is pretty good but not optimal. How many tests does the optimal algorithm require?

## Re: December Ponder

Anyone interested?


## Re: December Ponder

Anyone interested? Problem 5 is really a good one.

## Re: December Ponder

A guide to the solution of problem 5: http://en.wikipedia.org/wiki/Noisy_channel_coding_theorem

## January Ponder this

The easiest ponder problem so far is here:

This month's puzzle concerns drawing balls from an urn. You know the urn contains 3 balls. You know each ball is white or black and that white balls are worth 1 unit, black balls are worth nothing. You know the urn is equally likely to contain 0,1,2 or 3 white balls. You have an opportunity to buy balls from the urn. Each ball costs c units (0<c<1). When you buy a ball one of the remaining balls in the urn is randomly selected and given to you. You may stop buying balls at any time based on the color of the balls you have received. Assuming you adopt the best strategy, what is your expected return as a function of c?

Is anyone interested?
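The state space here is small enough for an exact answer. A sketch of a dynamic program over the number of white and black balls drawn so far (my own modeling: a uniform prior over 0-3 white balls, hypergeometric posterior updates, and exact rational arithmetic — treat the setup as an assumption):

```python
from fractions import Fraction
from math import comb

def expected_return(c):
    """Expected profit under the optimal stopping rule, cost c per ball."""
    c = Fraction(c)

    def p_next_white(w, b):
        # posterior over the number W of white balls after seeing w white, b black
        d = w + b
        pairs = [(W, Fraction(comb(W, w) * comb(3 - W, b), comb(3, d)))
                 for W in range(4) if W >= w and 3 - W >= b]
        total = sum(wt for _, wt in pairs)
        return sum(wt * Fraction(W - w, 3 - d) for W, wt in pairs) / total

    def value(w, b):
        # expected additional profit from state (w, b), playing optimally
        if w + b == 3:
            return Fraction(0)
        p = p_next_white(w, b)
        cont = -c + p * (1 + value(w + 1, b)) + (1 - p) * value(w, b + 1)
        return max(Fraction(0), cont)  # stopping is always worth 0 extra

    return value(0, 0)

print(expected_return(Fraction(1, 4)))  # example: c = 1/4
```

Sanity checks: at c = 0 the optimal play buys everything and earns the prior mean of 3/2, and for c close to 1 it never pays to buy at all.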

## March Ponder This

S has a good, G, which he is considering selling to B. The values of G to S and B are independent random variables uniformly distributed between 0 and 1. S and B know this, and each knows his own valuation but not the valuation of the other party.

1. Suppose B makes a single offer which S accepts or rejects depending on whether or not the offer exceeds the value S places on the item. What should B offer to maximize his expected gain (the difference when a sale occurs between B's valuation of G and the sales price, 0 when no sale occurs) as a function of B's valuation of G? What is B's expected gain? What is S's?

2. Suppose there are 2 buyers B1, B2 (each with a valuation independently uniformly distributed between 0 and 1). Suppose each makes a single bid for G and S accepts the larger if and only if it exceeds S's valuation. Find a bidding strategy for B1 and B2 which is optimal in that if one adopts it the other can do no better than to adopt it also. What will be the expected gains of B1, B2 and S?
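For part 1, under my reading, an offer b is accepted exactly when the seller's uniform valuation falls below b, so the buyer's expected gain is (v - b)*b, maximized at b = v/2 with gain v^2/4. A small grid-search sketch checking that (my own code and names):

```python
def expected_gain(v, b):
    # seller accepts iff her uniform valuation s < b; buyer then gains v - b
    return (v - b) * b

def best_offer(v, steps=10_000):
    # brute-force the offer b over a grid on [0, 1]
    return max((expected_gain(v, i / steps), i / steps) for i in range(steps + 1))[1]

print(best_offer(0.8))  # expected to land near v/2 = 0.4
```

Averaging v^2/4 over a uniform v gives an overall expected gain of 1/12 for B, and a similar integration suggests 1/24 for S, though I have not cross-checked those against a published solution.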

## Re: March Ponder This

Any ideas please??

## Matrix

There is a 20x9 state table. Every cell determines its next value (0 or 1) based on its eight neighbors: a cell having value 0 will change its value to 1 if and only if exactly three of its neighbors have value 1; a cell having value 1 will remain 1 if and only if two or three of its neighbors have value 1.

All the changes are done simultaneously.

Find a possible state of the table whose next state contains 1s at the following positions and 0s at all other positions.

[3,3], [3,4], [3,5], [3,8], [3,9], [3,10], [3,14], [3,18],

[4,9], [4,11], [4,11], [4,14], [4,15], [4,17], [4,18],

[5,4], [4,9], [4,10], [4,14], [4,16], [4,18],

[6,4], [6,9], [6,11], [6,14], [6,18],

[7,3], [7,4], [7,5], [7,8], [7,9], [7,10], [7,14], [7,18]
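As noted in the replies, this update rule is Conway's Game of Life on a bounded board (cells outside the board count as 0). Any candidate predecessor can be checked mechanically by applying one step and comparing with the target pattern; a sketch of the step function (my own helper, not from the thread):

```python
def life_step(grid):
    """One synchronous update; neighbors outside the board count as 0."""
    R, C = len(grid), len(grid[0])
    nxt = [[0] * C for _ in range(R)]
    for r in range(R):
        for c in range(C):
            live = sum(grid[r + dr][c + dc]
                       for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                       if (dr, dc) != (0, 0)
                       and 0 <= r + dr < R and 0 <= c + dc < C)
            # birth on exactly 3 neighbors; survival on 2 or 3
            nxt[r][c] = 1 if live == 3 or (grid[r][c] == 1 and live == 2) else 0
    return nxt

# sanity check on a 'blinker', which oscillates with period 2
g = [[0] * 5 for _ in range(5)]
for c in (1, 2, 3):
    g[2][c] = 1
assert life_step(life_step(g)) == g
```

To test a 20x9 candidate, build the target pattern as a 0/1 grid and compare it with `life_step(candidate)`.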

## Re: Matrix

> There is a 20x9 state table. Every cell determines its value (0 or 1) based on its eight neighbors: a cell having value 0 will change its value to 1 if and only if exactly three of its neighbors are having value 1; a cell having value 1 will remain to 1 if and only if two or three of its neighbors are having value 1.
>
> All the changes are done simultaneously.

OK, that's Conway's Game of Life on a 20x9 chessboard.

> Find a possible state of the table whose next state contains 1s at following positions and zeros at rest of the positions.
>
> [3,3], [3,4], [3,5], [3,8], [3,9], [3,10], [3,14], [3,18],
> [4,9], [4,11], [4,11], [4,14], [4,15], [4,17], [4,18],
> [5,4], [4,9], [4,10], [4,14], [4,16], [4,18],
> [6,4], [6,9], [6,11], [6,14], [6,18],
> [7,3], [7,4], [7,5], [7,8], [7,9], [7,10], [7,14], [7,18]

I think you meant:

[3,3], [3,4], [3,5], [3,8], [3,9], [3,10], [3,14], [3,18],

[4,4], [4,9], [4,11], [4,11], [4,14], [4,15], [4,17], [4,18],

[5,4], [5,9], [5,10], [5,14], [5,16], [5,18],

[6,4], [6,9], [6,11], [6,14], [6,18],

[7,3], [7,4], [7,5], [7,8], [7,9], [7,10], [7,14], [7,18]

so that you form the word "IBM" :)

Well, I think the trick should be to prove that there is no such "predecessor" for this state.

However, a full search would require C*2^180 computations, so it's infeasible.

A trick could be to prove that one of the letters I, B, M is unreachable...

## Re: Matrix

You are right, Silvio.

Actually I wanted to post the exact problem, and that is why I asked if anybody knows how to add an attachment to a post, so that I could attach that picture of I B M.

Also, your answer is right: I have shown through a program in Perl that no such predecessor exists. But the thing is, it needs to be proved analytically or statistically rather than through a program.

Do you have any idea?

## June Ponder This

This month's problems concern DNA testing. DNA tests can exclude people as the source of a DNA sample. Assume that the tests will never exclude the actual source but sometimes will fail to exclude someone who is not the actual source but who by chance happens to match the actual source at the DNA locations being tested.

1. Suppose before DNA testing we estimate X has p chance of being the actual source. Suppose there is a random match probability of 1/n. If testing does not exclude X what is our new estimate of the probability that X is the source?

2. Suppose we have a database of DNA data from k people. Suppose before comparing with the DNA sample we estimate that there is a chance p that the database contains the actual source and further that every person in the database is equally likely to be the actual source. Assume as above that the random match probability for someone who is not the actual source is 1/n and that this probability is independent for multiple people who are not the actual source. Suppose we find exactly one person, X, in the database whose DNA matches the sample. What is our new estimate of the probability that the database contains the actual source (and therefore that the actual source is X)?

3. In an actual case k was 338000 and the random match probability was said to be 1/1100000. Suppose we further assume (before checking) that there is a 20% chance that the actual donor is in the database and that everyone in the database is equally likely to be the actual donor. Suppose (as in the actual case) exactly one person, X, matching the sample is found in the database. Subject to the above assumptions, what is the probability that X is the actual donor? Give the probability rounded to four decimal places.
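Question 1 is a direct Bayes update, and question 2 reduces to a similar odds ratio under the stated independence assumptions. A sketch (the formulas are my own working of the problem, not an official solution — in particular the question-2 model of one guaranteed match from the true source plus independent 1/n coincidental matches is my reading):

```python
def posterior_source(p, n):
    """Q1: the true source always matches; a non-source matches w.p. 1/n."""
    return p / (p + (1 - p) / n)

def posterior_db(p, k, n):
    """Q2: chance the database of k people contains the source, given
    exactly one match.  Both hypotheses share a factor (1 - 1/n)^(k-1),
    which cancels, leaving a simple odds ratio."""
    return p / (p + (1 - p) * k / n)

print(posterior_db(0.2, 338000, 1100000))  # question 3's numbers
```

Plugging in question 3's numbers this model gives roughly 0.4486, but that figure depends on the modeling assumptions above.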

## Re: June Ponder This

Anyone want to share something?

## Re: June Ponder This

PARASHAAAR! Hi' doing!

My dear friend, the probability that someone on PM is interested in this kind of problem is approximately equal to the answer of this June ponder. Why does this happen? I don't know. Personally, I hate this kind of problem.

perucho

## grid

You have a list of items you need to buy today, and you know the locations (represented as points on a cartesian grid) of a few stores in the area. You also know which of these stores are selling each item on your list, and at what price each store sells it. Given the price of gas, what is the minimum amount you need to spend in order to buy all the items on your shopping list and then drive back home? You start and end the journey at your house, which is located at (0,0).

To make matters interesting, some of the items on your list may be perishable. Whenever you make a purchase that includes one or more perishable items, you cannot drive to another store without first stopping back at your house. Every item on your shopping list is guaranteed to be sold by at least one store, so the trip will always be possible.
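A brute-force sketch of a simplified variant (my own code and data layout; it makes every store visit its own round trip from home, which automatically satisfies the perishables rule but can overestimate the optimum when non-perishable stops could be chained into one tour):

```python
from itertools import product
from math import hypot

def cheapest_plan(items, stores, gas):
    """items: list of item names.
    stores: dict name -> ((x, y), {item: price}).
    gas: travel cost per unit of distance driven."""
    # for each item, the stores that actually sell it
    options = [[s for s in stores if it in stores[s][1]] for it in items]
    best = float("inf")
    for choice in product(*options):       # assign each item to one store
        goods = sum(stores[s][1][it] for s, it in zip(choice, items))
        # one home -> store -> home round trip per distinct store used
        travel = sum(2 * gas * hypot(*stores[s][0]) for s in set(choice))
        best = min(best, goods + travel)
    return best

stores = {"A": ((3, 4), {"milk": 10, "eggs": 9}), "B": ((0, 1), {"eggs": 2})}
print(cheapest_plan(["milk", "eggs"], stores, gas=1.0))
```

Here buying eggs at the nearby cheap store and milk at the far one beats doing everything in one place, since the extra 2 units of gas cost less than the 7-unit price difference.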

## Exciting Problem

There is a very exciting problem at http://projecteuler.net/index.php?section=problems&id=202

Is anyone interested in discussing it?

## Re: Exciting Problem

I had no patience to compute the exact value, but I'll tell you what I did.

We extend the triangle to the lattice generated by the vectors u:=C-->A and v:=C-->B. We color the lattice with the letters A, B and C in the proper way. Our solution corresponds to a line segment that links the zero vector to another vector w colored by C and with positive coordinates (i.e. w=au+bv, a,b>0). This vector should satisfy the following conditions:

(1). a-b=0 mod 3. This means that w=au+bv is colored by C.

(2). a and b are coprime. This means that the laser doesn't get out before w.

(3). (a-1)+(b-1)+(a+b-1) = 2a+2b-3 = 12017639147. This means that the laser hits the walls 12017639147 times before getting out.

Now the condition (3) tells that

(4) a+b=6008819575 (which is 1 mod 3).

Thus by (1) and (4) we have a = b = 2 mod 3.

By (4) we have that a and b are coprime iff a and 6008819575 are coprime (i.e., a is not a multiple of any prime factor of 6008819575).

6008819575=5*5*11*17*23*29*41*47

Now all we have to do is count the number of positive integers congruent to 2 mod 3, less than 6008819575, and not a multiple of 5, 11, 17, 23, 29, 41 or 47.

It is not difficult but it is a lot of calculus for a guy that is good in computer programing.

The best I can do is an estimate:

(1/3)*6008819575*(1-1/5)*(1-1/11)*...*(1-1/47)=1209002666.(6)

I guess it's less than that.
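The counting step lynx describes can be done exactly with inclusion-exclusion over the prime factors. A sketch (my own helper functions; since the thread disputes the final figure, the big run is printed rather than asserted, and the method is checked against brute force on small cases instead):

```python
from itertools import combinations

PRIMES = [5, 11, 17, 23, 29, 41, 47]   # distinct prime factors of 6008819575
S = 6008819575

def count_res(limit, r):
    # how many m in [1, limit] have m = r (mod 3), for r in {1, 2}
    return (limit - r) // 3 + 1 if limit >= r else 0

def count(M, primes=PRIMES):
    # a in [1, M] with a = 2 (mod 3) and a coprime to every prime listed
    total = 0
    for k in range(len(primes) + 1):
        for sub in combinations(primes, k):
            d = 1
            for p in sub:
                d *= p
            # multiples of d: a = d*m; since d is invertible mod 3,
            # a = 2 (mod 3) iff m = 2*d (mod 3)
            total += (-1) ** k * count_res(M // d, (2 * d) % 3)
    return total

print(count(S - 1))
```

Only 2^7 subsets are needed, so the exact count is instant; comparing it with the 1209002666.(6) estimate above shows how close the heuristic gets.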

## Re: Exciting Problem

PARASHAR:

Looks interesting. Can the laser pass through the infinitesimal gap at any angle?

I think it would be best to work the problem backwards. Start with the exit path, and then start considering the combinations from there.

Is there a proof of the first statement? (There are 80840...) It might be illuminating.

Anyway, just throwing some stuff in there. I'll keep you posted with any results.

-YourInnerNurmo

## Re: Exciting Problem

There is a standard trick for solving all problems of this sort. Reflect the triangle to obtain a lattice. Any path will then become a straight line segment, and the number of bounces will become the number of times this segment intersects the lattice. For simplicity, set up an oblique coordinate system in which the point C is the origin, A has coordinates (1,0), and B has coordinates (0,1). Then points of the lattice are points with integer coordinates, and a line is part of the lattice if and only if it is of the form x = n, y = n, or x+y = n, where n is an integer. Reflections of C are points such that x is congruent to y modulo 3, reflections of A have x congruent to y+1 modulo 3, and reflections of B have x congruent to y+2 modulo 3.

Then a line from (0,0) to (m,n) will intersect m-1 lines of the form x = integer, intersect n-1 lines of the form y = integer, and intersect m+n-1 lines of the form x+y = integer. Hence, there will be a total of 2m + 2n - 3 bounces. Now your problem reduces to counting solutions of a simple linear equation: given an integer b, how many pairs (m,n) are there such that m is congruent to n modulo 3 and 2m + 2n = b + 3?

## Re: Exciting Problem

To YourInnerNurmo: I think there is no need to consider angles at those gaps. Anyway, there is no solution available for the first one.

To Raymond Puzio: I think you are right. In the last statement, "Given an integer b, how many pairs (x,y) are there such that x is congruent to y modulo 3 and 2m + 2n = b + 3?", there are some junk characters present, so could you please elaborate on that?

## Re: Exciting Problem

>(...)but it is a lot of calculus for a guy that is good in computer programing.

I mean: for a guy that is NOT good in computer programing.

If I made no mistake the answer is 1209002625.

## Re: Exciting Problem

Sorry lynx, your answer is not correct. I think we are making some mistakes in optimizing the calculation.
