Velocity Reviews > numpy (matrix solver) - python vs. matlab

# numpy (matrix solver) - python vs. matlab

someone

 05-01-2012
On 05/01/2012 08:56 AM, Russ P. wrote:
> On Apr 29, 5:17 pm, someone<newsbo...@gmail.com> wrote:
>> On 04/30/2012 12:39 AM, Kiuhnm wrote:
>>> You should try to avoid matrix inversion altogether if that's the case.
>>> For instance you shouldn't invert a matrix just to solve a linear system.

>>
>> What then?
>>
>> Cramer's rule?

>
> If you really want to know just about everything there is to know
> about a matrix, take a look at its Singular Value Decomposition (SVD).

I know a bit about SVD - I used it for a short period of time in Matlab,
though I'm definitely not an expert in it, and I don't fully understand
the orthogonality theory that makes it work as elegantly as it does.

> I've never used numpy, but I assume it can compute an SVD.

I'm making my first steps now with numpy, so there's a lot I don't know
and haven't tried with numpy...

someone

 05-01-2012
On 04/30/2012 03:35 AM, Nasser M. Abbasi wrote:
> On 04/29/2012 07:59 PM, someone wrote:
> I do not use Python much myself, but a quick Google search showed that
> Python's scipy has an API for linalg, so use the following code example
> from the documentation:
>
> X = scipy.linalg.solve(A, B)
>
> But you still need to check the cond(). If it is too large, not good.
> How large and all that depends on the problem itself. But as a rule of
> thumb, the lower the better. Less than 100 can be good in general, but I
> really can't give you a fixed number to use, as I am not an expert in
> this subject; others who know more about it might have better
> recommendations.

Ok, that's a number...

Does anyone want to weigh in? Do I hear anything better than "less
than 100 can be good in general"?

If I don't hear anything better, the limit is now 100...

What's the limit in Matlab (on the condition number of the matrices), by
the way, before it comes up with a warning?
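A minimal sketch of that advice in numpy/scipy terms (the 100 cutoff here is just this thread's rule of thumb, not an established constant):

```python
import numpy as np
from scipy.linalg import solve

# A small, well-conditioned example system.
A = np.array([[3.0, 1.0],
              [1.0, 2.0]])
b = np.array([9.0, 8.0])

cond = np.linalg.cond(A)   # 2-norm condition number
if cond < 100:             # the thread's rule-of-thumb cutoff
    x = solve(A, b)        # LU-based solve; no explicit inverse formed
else:
    raise ValueError(f"cond(A) = {cond:.1f} is large; result may be inaccurate")

print(x)                   # [2. 3.]
```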

Colin J. Williams

 05-01-2012
On 01/05/2012 2:43 PM, someone wrote:
[snip]
>> a = [1 2 3]; b = [11 12 13]; c = [21 22 23].
>>
>> Then notice that c = 2*b - a. So c is linearly dependent on a and b.
>> Geometrically this means the three vectors are in the same plane,
>> so the matrix doesn't have an inverse.

>

Does it not mean that there are three parallel planes?

Consider the example in two dimensional space.

Colin W.
[snip]
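A quick numpy check of the dependence discussed above (c = 2*b - a), confirming that the stacked matrix is rank-deficient and therefore has no inverse:

```python
import numpy as np

a = np.array([1.0, 2.0, 3.0])
b = np.array([11.0, 12.0, 13.0])
c = np.array([21.0, 22.0, 23.0])

M = np.vstack([a, b, c])            # rows are the three vectors

print(np.allclose(c, 2*b - a))      # True: c is linearly dependent on a, b
print(np.linalg.matrix_rank(M))     # 2, not 3: the rows span only a plane
print(abs(np.linalg.det(M)) < 1e-9) # True: determinant is 0 up to round-off
```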

Russ P.

 05-01-2012
On May 1, 11:52 am, someone <newsbo...@gmail.com> wrote:
> On 04/30/2012 03:35 AM, Nasser M. Abbasi wrote:
>
> > On 04/29/2012 07:59 PM, someone wrote:
> > I do not use Python much myself, but a quick Google search showed that
> > Python's scipy has an API for linalg, so use the following code example
> > from the documentation:

>
> > X = scipy.linalg.solve(A, B)

>
> > But you still need to check the cond(). If it is too large, not good.
> > How large and all that depends on the problem itself. But as a rule of
> > thumb, the lower the better. Less than 100 can be good in general, but I
> > really can't give you a fixed number to use, as I am not an expert in
> > this subject; others who know more about it might have better
> > recommendations.

>
> Ok, that's a number...
>
> Does anyone want to weigh in? Do I hear anything better than "less
> than 100 can be good in general"?
>
> If I don't hear anything better, the limit is now 100...
>
> What's the limit in Matlab (on the condition number of the matrices), by
> the way, before it comes up with a warning?

The threshold of acceptability really depends on the problem you are
trying to solve. I haven't solved linear equations for a long time,
but off hand, I would say that a condition number over 10 is
questionable.

A high condition number suggests that the selection of independent
variables for the linear function you are trying to fit is not quite
right. For a poorly conditioned matrix, your modeling function will be
very sensitive to measurement noise and other sources of error, if
applicable. If the condition number is 100, then any input on one
particular axis gets magnified 100 times more than other inputs.
Unless your inputs are very precise, that is probably not what you
want.

Or something like that.
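Russ's point can be demonstrated numerically; here is a sketch with a deliberately ill-conditioned matrix (the numbers are chosen only for illustration):

```python
import numpy as np

# Nearly dependent rows => large condition number (~4e4 here).
A = np.array([[1.0, 1.0],
              [1.0, 1.0001]])
b = np.array([2.0, 2.0])
x = np.linalg.solve(A, b)            # exact answer: [2, 0]

# Add a tiny perturbation to b, as measurement noise would.
db = np.array([0.0, 1e-4])
x2 = np.linalg.solve(A, b + db)      # jumps to [1, 1]

rel_in = np.linalg.norm(db) / np.linalg.norm(b)
rel_out = np.linalg.norm(x2 - x) / np.linalg.norm(x)
print(rel_out / rel_in)              # ~2e4: huge amplification of the noise
print(np.linalg.cond(A))             # ~4e4, the upper bound on it
```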

someone

 05-01-2012
On 05/01/2012 09:59 PM, Colin J. Williams wrote:
> On 01/05/2012 2:43 PM, someone wrote:
> [snip]
>>> a = [1 2 3]; b = [11 12 13]; c = [21 22 23].
>>>
>>> Then notice that c = 2*b - a. So c is linearly dependent on a and b.
>>> Geometrically this means the three vectors are in the same plane,
>>> so the matrix doesn't have an inverse.

>>

>
> Does it not mean that there are three parallel planes?
>
> Consider the example in two dimensional space.

I actually drew it and saw it... It means that you can construct a 2D
plane and all 3 vectors lie in this 2D plane...

someone

 05-01-2012
On 05/01/2012 10:54 PM, Russ P. wrote:
> On May 1, 11:52 am, someone<newsbo...@gmail.com> wrote:
>> On 04/30/2012 03:35 AM, Nasser M. Abbasi wrote:

>> What's the limit in matlab (on the condition number of the matrices), by
>> the way, before it comes up with a warning ???

>
> The threshold of acceptability really depends on the problem you are
> trying to solve. I haven't solved linear equations for a long time,
> but off hand, I would say that a condition number over 10 is
> questionable.

Does anyone know the threshold Matlab uses for warning when solving
x=A\b? I tried "edit slash" but this seems to be internal, so I cannot
see what criterion the warning is based on...

> A high condition number suggests that the selection of independent
> variables for the linear function you are trying to fit is not quite
> right. For a poorly conditioned matrix, your modeling function will be
> very sensitive to measurement noise and other sources of error, if
> applicable. If the condition number is 100, then any input on one
> particular axis gets magnified 100 times more than other inputs.
> Unless your inputs are very precise, that is probably not what you
> want.
>
> Or something like that.

Ok. So it's like a frequency-response function: output divided by input...
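On the Matlab-threshold question above: as far as I know, backslash warns when LAPACK's reciprocal condition estimate (RCOND) comes down near machine epsilon (~2.2e-16), i.e. cond(A) on the order of 1e16 - far beyond 100. A rough numpy imitation (the exact Matlab criterion is internal, so the factor of 10 below is an assumption for illustration):

```python
import numpy as np

def solve_like_backslash(A, b):
    """Solve Ax = b, warning roughly the way MATLAB's A\\b does when the
    matrix is close to singular (the threshold here is an assumption)."""
    rcond = 1.0 / np.linalg.cond(A)          # reciprocal condition number
    if rcond < 10 * np.finfo(float).eps:
        print(f"Warning: matrix close to singular. RCOND = {rcond:.3e}")
    return np.linalg.solve(A, b)

A = np.array([[1.0, 2.0], [3.0, 4.0]])       # cond ~ 15: no warning
b = np.array([5.0, 6.0])
x = solve_like_backslash(A, b)
print(x)                                     # [-4.   4.5]
```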

Paul Rubin

 05-01-2012
someone <> writes:
> Actually I know some... I just didn't think about this as much as I
> should have before writing the question. I know there's also something
> like singular value decomposition that I think can help solve
> otherwise ill-posed problems,

You will probably get better advice if you are able to describe what
problem (ill-posed or otherwise) you are actually trying to solve. SVD
just separates out the orthogonal and scaling parts of the
transformation induced by a matrix. Whether that is of any use to you
is unclear since you don't say what you're trying to do.

Russ P.

 05-01-2012
On May 1, 4:05 pm, Paul Rubin <no.em...@nospam.invalid> wrote:
> someone <newsbo...@gmail.com> writes:
> > Actually I know some... I just didn't think about this as much as I
> > should have before writing the question. I know there's also something
> > like singular value decomposition that I think can help solve
> > otherwise ill-posed problems,

>
> You will probably get better advice if you are able to describe what
> problem (ill-posed or otherwise) you are actually trying to solve. SVD
> just separates out the orthogonal and scaling parts of the
> transformation induced by a matrix. Whether that is of any use to you
> is unclear since you don't say what you're trying to do.

I agree with the first sentence, but I take slight issue with the word
"just" in the second. The "orthogonal" part of the transformation is
non-distorting, but the "scaling" part essentially distorts the space.
At least that's how I think about it. The larger the ratio between the
largest and smallest singular value, the more distortion there is. SVD
may or may not be the best choice for the final algorithm, but it is
useful for visualizing the transformation you are applying. It can
provide clues about the quality of the selection of independent
variables, state variables, or inputs.
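The distortion described here can be measured directly: the ratio of the largest to the smallest singular value *is* the condition number. A small sketch mapping the unit circle through a matrix:

```python
import numpy as np

A = np.array([[3.0, 0.0],
              [0.0, 0.5]])             # stretch x by 3, shrink y to 0.5

s = np.linalg.svd(A, compute_uv=False) # singular values, descending order
print(s[0] / s[-1])                    # 6.0, equal to np.linalg.cond(A)

# The unit circle is mapped to an ellipse with semi-axes s[0] and s[-1]:
theta = np.linspace(0.0, 2.0 * np.pi, 400)
circle = np.vstack([np.cos(theta), np.sin(theta)])
radii = np.linalg.norm(A @ circle, axis=0)
print(radii.max(), radii.min())        # ~3.0 and ~0.5
```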

someone

 05-02-2012
On 05/02/2012 01:05 AM, Paul Rubin wrote:
> someone<> writes:
>> Actually I know some... I just didn't think about this as much as I
>> should have before writing the question. I know there's also something
>> like singular value decomposition that I think can help solve
>> otherwise ill-posed problems,

>
> You will probably get better advice if you are able to describe what
> problem (ill-posed or otherwise) you are actually trying to solve. SVD

I don't understand what else I should write. I gave the singular-matrix
example because it would be nice to learn some things for future use
(for instance understanding SVD more - perhaps someone can explain SVD
geometrically; that would be really nice, I hope)...

> just separates out the orthogonal and scaling parts of the
> transformation induced by a matrix. Whether that is of any use to you
> is unclear since you don't say what you're trying to do.

Still, I don't think I completely understand SVD. SVD (at least in
Matlab) returns 3 matrices; one is a diagonal matrix, I think. I think I
would understand it better with geometric examples, if someone would be
so kind as to write something about that... I can plot 3D vectors in
Matlab very easily, so maybe I'll understand SVD better if I hear/read a
geometric explanation (references to textbooks or similar are also
appreciated).
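A small numpy sketch of the geometric reading (numpy's svd returns the diagonal part as a 1-D array s rather than a full matrix): A = U @ diag(s) @ Vt, i.e. rotate (Vt), scale along the axes (s), rotate again (U). U and Vt are orthogonal, so all the distortion lives in s:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])

U, s, Vt = np.linalg.svd(A)

# U and Vt are orthogonal: pure rotations/reflections, no distortion.
print(np.allclose(U @ U.T, np.eye(2)))      # True
print(np.allclose(Vt @ Vt.T, np.eye(2)))    # True

# Reassembling the three factors recovers A exactly.
print(np.allclose(A, U @ np.diag(s) @ Vt))  # True

# Geometrically: A maps the unit circle to an ellipse whose semi-axis
# lengths are s[0] and s[1], pointing along the columns of U.
print(s)
```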

someone

 05-02-2012
On 05/02/2012 01:38 AM, Russ P. wrote:
> On May 1, 4:05 pm, Paul Rubin<no.em...@nospam.invalid> wrote:
>> someone<newsbo...@gmail.com> writes:
>>> Actually I know some... I just didn't think about this as much as I
>>> should have before writing the question. I know there's also something
>>> like singular value decomposition that I think can help solve
>>> otherwise ill-posed problems,

>>
>> You will probably get better advice if you are able to describe what
>> problem (ill-posed or otherwise) you are actually trying to solve. SVD
>> just separates out the orthogonal and scaling parts of the
>> transformation induced by a matrix. Whether that is of any use to you
>> is unclear since you don't say what you're trying to do.

>
> I agree with the first sentence, but I take slight issue with the word
> "just" in the second. The "orthogonal" part of the transformation is
> non-distorting, but the "scaling" part essentially distorts the space.
> At least that's how I think about it. The larger the ratio between the
> largest and smallest singular value, the more distortion there is. SVD
> may or may not be the best choice for the final algorithm, but it is
> useful for visualizing the transformation you are applying. It can
> provide clues about the quality of the selection of independent
> variables, state variables, or inputs.

I would like to hear more!

I would really appreciate it if anyone could post a simple SVD example
and explain what the vectors from the SVD represent geometrically /
visually, because I don't understand it well enough, and I'm sure it's
very important when it comes to solving matrix systems...
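One concrete way SVD helps with solving matrix systems, sketched here with the singular matrix discussed earlier in the thread: np.linalg.solve would fail on it, but the SVD-based pseudoinverse (np.linalg.pinv) still returns the minimum-norm least-squares solution:

```python
import numpy as np

# The singular matrix from earlier: row 3 = 2*row 2 - row 1.
A = np.array([[ 1.0,  2.0,  3.0],
              [11.0, 12.0, 13.0],
              [21.0, 22.0, 23.0]])
b = np.array([1.0, 2.0, 3.0])   # chosen consistent: 2*2 - 1 = 3

# pinv computes the SVD, inverts only the nonzero singular values,
# and reassembles, sidestepping the singularity.
x = np.linalg.pinv(A) @ b       # minimum-norm least-squares solution
print(np.allclose(A @ x, b))    # True: b lies in the column space of A
```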