< Earlier Kibitzing · PAGE 53 OF 57 ·
Later Kibitzing> |
Aug-23-23
 | | alexmagnus: Not PhD, just master's (to be more precise, a diploma from before the bachelor/master system was introduced in Germany but that is equivalent to master). |
|
Aug-23-23 | | Rdb: OK. Thanks for the response, <alexmagnus>. Regards
2) Can I ask you, if it is not intrusive, what career you chose after your master's in maths? Thank you.
Regards |
|
Aug-24-23
 | | alexmagnus: I actually couldn't find any job after my master's because I lacked programming knowledge, and most offers came from IT companies (banks and insurances are another common field for mathematicians but my financial math was not good at all). So I went on to study again, making a bachelor's in computer science. Working as a software developer since. |
|
Aug-25-23 | | Rdb: <alexmagnus: I actually couldn't find any job after my master's because I lacked programming knowledge, and most offers came from IT companies (banks and insurances are another common field for mathematicians but my financial math was not good at all).
So I went on to study again, making a bachelor's in computer science. Working as a software developer since> Thank you for sharing <alexmagnus> Regards. |
|
Aug-27-23 | | Rdb: <alexmagnus>, I have heard that machine learning/artificial intelligence requires a deep understanding of all of MSc maths (master-level maths). I was hoping that you could be in that field and give me some insights into how essential advanced maths is for machine learning/artificial intelligence. Thank you. Regards |
|
Aug-27-23
 | | alexmagnus: <I was hoping that you could be in that field and give me some insights about how advanced maths is essential for machine learning/artificial intelligence.> During my CS studies I had only some basics of neural networks. There the math was advanced but not super advanced; about halfway through the BSc study would be enough. But it was a "basic" NN course (in the end we programmed a robot that could navigate one very specific labyrinth, having learned the path by training its NN), so no idea how it is in the case of, say, deep learning. |
|
Aug-28-23 | | Rdb: Thanks a lot for the response, <alexmagnus>. I intend to start studying MSc maths soon, so that is the context of my interest in this conversation. 2) I have something more to say about it, and I intend to post it here in this forum of yours within the next few days (if not today), if you please. Thank you 🙏🙏 |
|
Sep-01-23
 | | alexmagnus: September:
European Union:
1. Firouzja 2777
2. Giri 2760
3. Rapport 2752
4-5. Vachier-Lagrave 2727
4-5. Topalov 2727
6. Duda 2726
7. Keymer 2717
8. Vallejo 2712
9. Sjugirov 2705
10. Deac 2702
Former Soviet Union:
1. Nepomniachtchi 2771
2. Karjakin 2750
3. Radjabov 2745
4. Mamedyarov 2734
5. Grischuk 2732
6. Abdusattorov 2716
7. Dubov 2710
8. Artemiev 2697
9. Eljanov 2695
10. Tomashevsky 2694
Former British Empire:
1. Caruana 2786
2. Nakamura 2780
3. Gukesh 2758
4. Anand 2754
5. So 2753
6. Dominguez 2745
7. Aronian 2742
8. Praggnanandhaa 2727
9. Vitiugov 2719
10. Vidit 2716
Asia:
1. Ding 2780
2. Gukesh 2758
3. Anand 2754
4. Le 2733
5. Praggnanandhaa 2727
6. Wei 2726
7. Yu 2720
8-9. Vidit 2716
8-9. Abdusattorov 2716
10. Erigaisi 2712
Born later than the world champion:
1. Firouzja 2777
2. Giri 2760
3. Gukesh 2758
4. So 2753
5. Rapport 2752
6. Praggnanandhaa 2727
7-8. Duda 2726
7-8. Wei 2726
9. Yu 2720
10. Keymer 2717
Nuclear powers:
1. Caruana 2786
2-3. Nakamura 2780
2-3. Ding 2780
4. Firouzja 2777
5. Nepomniachtchi 2771
6. Gukesh 2758
7. Anand 2754
8. So 2753
9. Karjakin 2750
10. Dominguez 2745
Old guard:
1. Anand 2754
2. Dominguez 2745
3. Aronian 2742
4. Grischuk 2732
5. Topalov 2727
6. Vallejo 2712
7. Eljanov 2695
8. Sargissian 2692
9. Svidler 2689
10. Kasimdzhanov 2673 |
|
Sep-06-23
 | | alexmagnus: At the beginning of October I'm not here, btw, so for those few who actually read my lists - they will only be here somewhere mid-October. Even though you can "compose" them yourselves, I'll still write them here mid-October for archive purposes. |
|
Sep-07-23
 | | OhioChessFan: I read them, occasionally comment, always appreciate. Thanks for doing it. |
|
Sep-07-23
 | | Check It Out: Hi <alexmagnus> I just noticed your lists, interesting. Do you now mean older than WC Ding Liren? Why do you skip a list with Carlsen? Lumping India and the US as former British colony gave me a chuckle. Anyway, no criticisms, just curious. |
|
Sep-08-23
 | | alexmagnus: Old guard is everyone born in (current year-40) or before, that is, currently 1983 or before. |
|
Sep-09-23
 | | Check It Out: Oh. I meant the list, "born later than the world champion" |
|
Sep-11-23
 | | alexmagnus: That's players younger than the world champion, not older, see the list itself. Carlsen somehow manages to escape all of my lists. He could have ended up on the "born later" list if Nepo had won either of his world championship matches. And Carlsen being Carlsen, he will probably retire before he can make it to the Old Guard list. |
|
Sep-12-23
 | | OhioChessFan: I'm surprised Carlsen continues to play as much as he has. |
|
Sep-23-23 | | Rdb: https://www.quora.com/Why-is-Machin... <It’s safe to say that if you intend to be a creator of ML research rather than a consumer, you better get used to devoting a large part of your life to understanding advanced math. I speak from experience: I’m close to inaugurating my 40th year of my career as an ML researcher and it has only gotten harder to master the necessary math. I don’t mean the kid stuff, like basic linear algebra, multivariate calculus, optimization, probability and statistics. This level of math is simply the price of admission into the field. This gets you into the cheap seats in the ML research stadium. You’ll still struggle to understand the recent advances in the field. You want to really push the frontier in the field? Start getting used to reading graduate level math textbooks. Here’s a short sampling of more advanced math that is increasingly proving essential to many sub fields in modern ML. I’ll briefly explain why each of these subfields is useful, so you’ll get to understand why. Topology: Topology is the intrinsic study of the “abstract” shape of a space. Physicists classify particles. Mathematicians classify spaces. Topology is the tool par excellence for this purpose. Topological data analysis is one of the fastest growing areas and is likely to be widely applied in many areas, from bioinformatics to digital marketing. I find topology to be incredibly elegant: it is beautiful to see concepts like continuity formalized using the topology of open sets. Calculus seems pedestrian and ugly in comparison. Topology is also at the heart of probability and statistics. Finally, the most important theorem in optimization — the Hahn Banach theorem — is a simple application of topology and among its many applications is the duality theory, Lagrange multipliers and the result that feedforward neural nets can represent any smooth function.
Measure theory: one of the bedrocks of machine learning is the idea of computing distances between objects — text documents, images, DNA sequences, probability distributions, and policies in reinforcement learning — and measure theory is the subfield of mathematics that distills the essence of how to define distances. Probability was axiomatized by the Russian mathematician Kolmogorov using measure theory nearly 100 years ago and it forms the foundation of statistics. Probability distributions are defined using Lebesgue measures. The theory of integration in measure theory is the basis for defining expected values of a random variable and for defining conditional expectation and regression.
Combinatorics: one of the oldest fields in discrete math useful in counting the number of objects parameterized by some variable n that usually grows asymptotically, combinatorics yields deep insights into many problems in machine learning. The theory of which functions are learnable in polynomial time, Vapnik-Chervonenkis (VC) theory at the heart of support vector machines, builds on combinatorial results of how many functions can be defined on a space. I recently built on some beautiful results on how many partial orders can be defined on n variables. Amazingly, asymptotically, all partial orders are almost surely comprised of three layers — did you know that?
Category theory: much of the edifice of 20th century mathematics has been built on category theory, which fundamentally changed mathematics from being a study of sets and their elements to the study of morphisms and functors. It is likely that in the next 50 years, some of the most profound advances in machine learning will come from applying category theory, which finds deep unifying structure among otherwise unrelated spaces. It is not inconceivable that deep within the brain, categorical concepts are implemented in some way. Category theory alas is forbiddingly hard, but hey, who said climbing K2 was going to be easy? The most famous mathematician of the 20th century, according to many mathematicians, was Grothendieck, who did much of his work on category theory.
Sheaf theory: the theory of sheaves might also prove extremely useful in machine learning, replacing the current widespread use of manifold learning. Manifolds are a special case of sheaves. What’s a sheaf? Think of a sheaf as a graph that is adorned with some data on its vertices and edges. A weighted graph is a trivial example. Imagine each vertex of the graph having an associated vector space called a stalk, and the edge connecting two vertices as associated with linear transformations that ensures local consistency ....> |
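The "sheaf as a graph adorned with data" picture in the quote above can be made concrete with a small sketch (a hypothetical toy, not code from any sheaf library): each vertex carries a stalk, here the vector space R², each edge carries linear restriction maps into a shared edge stalk, and local consistency means the two maps agree on the data at the edge's endpoints.

```python
import numpy as np

# Toy sketch of a cellular sheaf on a single edge of a graph, following the
# quoted description: each vertex carries a "stalk" (here R^2), and the edge
# (u, v) carries linear restriction maps F_u, F_v from the endpoint stalks
# into an edge stalk. Vertex data is locally consistent over the edge when
# both restriction maps send it to the same edge-stalk vector.

def locally_consistent(x_u, x_v, F_u, F_v, tol=1e-9):
    """Check the sheaf condition F_u @ x_u == F_v @ x_v on one edge."""
    return bool(np.linalg.norm(F_u @ x_u - F_v @ x_v) < tol)

# Vertex data and restriction maps for edge (u, v)
x_u = np.array([1.0, 2.0])
x_v = np.array([1.0, 2.0])
F_u = np.eye(2)  # identity maps: the trivial, weighted-graph-like case
F_v = np.eye(2)

print(locally_consistent(x_u, x_v, F_u, F_v))      # True: identical data agrees
print(locally_consistent(x_u, x_v, F_u, 2 * F_v))  # False: the maps now disagree
```

With identity restriction maps this degenerates to "neighboring vertices carry equal data"; non-trivial choices of F_u and F_v are what give sheaves their extra expressive power over plain weighted graphs.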
|
Sep-24-23 | | Rdb: https://www.quora.com/How-important... <How important is a strong mathematical background in machine learning?
Mathematics is the language that modern machine learning is built on, so trying to do machine learning without knowledge of math is like trying to play classical music without being able to read scores, i.e. not very well. In essence, machine learning is about extracting structure from data: finding the patterns amongst a great deal of noise. Nothing prepares you better for a lifetime of doing machine learning as a deep knowledge of math. In fact, one can turn this question on its head: often students who took my graduate ML class at UMass came to me and marveled at how they finally understood what their linear algebra or calculus or statistics class they took eons ago was trying to teach them! The deeper one gets into a problem, the more math seems necessary to formulate and solve a machine learning task. Let’s consider an example, to make this more concrete. A fundamental challenge for machine learning is how to transfer the results of learning from one setting to another. Humans seem to be able to do this fairly effortlessly, often to a level that befuddles any machine learning approach. I just returned from a trip to Norway and Sweden. I’d never been to either country before my trip. In essence, I had no training data of actually having been in either country. Did that pose a problem. Not an iota. I just applied the basic knowledge of how to get around cities or airports from past experience. We do this as humans as easily as we breathe. Most machine learning approaches, from au courant deep learning to good old fashioned classical Gaussian least squares regression, will choke if the test data (e.g., taking a train from the airport in Stockholm or Oslo to city center) differs from the training data (e.g., prior experience of taking trains from San Francisco airport to the city center). The reason being that the statistics are entirely different. The distribution of features in Sweden or Norway is very different from San Francisco. 
So, how does one solve the problem of transfer learning? Let’s imagine a very basic idea of computing some aggregate statistics of the source dataset (e.g., for every feature, compute its correlation against all other features) and doing the same for the target dataset. It might be helpful to recall that the source dataset is labeled but the target dataset is not (that is, from our prior experience, we know what the signs mean in San Francisco since we speak English, but we can't interpret the signs in Stockholm or Oslo since we don't speak Swedish or Norwegian). So, we are in essence computing the covariance matrix of the source dataset and the covariance matrix of the target dataset. This matrix summarizes the second order statistics of the data. Under the simplifying assumption that the second order statistics are all that matter, in other words the distributions of features are Gaussian, we can now proceed to find a solution. The source and target covariance matrices are going to be different in transfer learning since the data are coming from two different multivariate Gaussian distributions. So, any approach like using Bayesian ML to find a classifier that labels source data will fail in the target domain since the covariances have changed. Even a method like support vector machines, which finds the best hyperplane that separates positive from negative examples in the source domain, cannot be applied in the target domain since the data have shifted unpredictably in the space of instances. OK, here’s a simple trick developed in a popular algorithm called CORAL (for correlational alignment). Find the matrix A such that the source covariance Σs can be transformed into something that matches the target covariance Σt. In the language of linear algebra, this amounts to solving the optimization problem (pardon the Latex): \min_A \| A^T \Sigma_s A - \Sigma_t \|^2
It ...> |
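The CORAL step quoted above can be sketched in a few lines of NumPy. This is a minimal illustration under the quote's second-order/Gaussian assumptions, not the reference implementation: the closed form A = Σs^{-1/2} Σt^{1/2} is one standard minimizer of the stated objective (it makes A^T Σs A equal Σt exactly), and the small ridge `eps` is an assumption added here for numerical invertibility.

```python
import numpy as np

def coral_transform(Xs, Xt, eps=1e-5):
    """Align source features Xs to the second-order statistics of target Xt.

    Sketch of CORAL (correlational alignment): whiten the source with
    Cs^{-1/2}, then re-color with Ct^{1/2}, so the transformed source
    covariance matches the target covariance.
    """
    # Covariances with a small ridge (assumption here) for invertibility
    Cs = np.cov(Xs, rowvar=False) + eps * np.eye(Xs.shape[1])
    Ct = np.cov(Xt, rowvar=False) + eps * np.eye(Xt.shape[1])

    def mat_pow(M, p):
        # Matrix power of a symmetric PSD matrix via eigendecomposition
        vals, vecs = np.linalg.eigh(M)
        return vecs @ np.diag(np.clip(vals, eps, None) ** p) @ vecs.T

    # A = Cs^{-1/2} Ct^{1/2} satisfies A^T Cs A = Ct, minimizing the objective
    A = mat_pow(Cs, -0.5) @ mat_pow(Ct, 0.5)
    return Xs @ A

# Toy demo: source and target drawn with different covariances
rng = np.random.default_rng(0)
Xs = rng.normal(size=(1000, 3)) @ np.diag([2.0, 0.5, 1.0])
Xt = rng.normal(size=(1000, 3))
Xs_aligned = coral_transform(Xs, Xt)
# After alignment, the covariance of Xs_aligned is close to that of Xt,
# so a classifier trained on the aligned source is less hurt by the shift.
```

The whiten-then-recolor reading makes the trick intuitive: CORAL does not touch the labels at all; it only reshapes the source feature cloud so its second-order geometry matches the target's.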
|
Sep-25-23 | | Rdb: As I had promised, <alexmagnus>. Any inputs, if you please?
Thank you. Regards 🙏 🙏 |
|
Oct-11-23
 | | alexmagnus: No inputs, as my ML background is by far not close enough to discuss the necessity of the more advanced math :). As I said, my entire background in ML/AI is a couple of courses in the bachelor's study. So the math there was advanced undergraduate math, but not graduate level. Like, I can't remember ever needing the category theory mentioned in one of the answers (category theory gave me quite some nightmares during the math master's study though :D). |
|
Oct-11-23
 | | alexmagnus: Lists for October, as announced back in September: European Union:
1. Firouzja 2777
2. Giri 2760
3. Rapport 2752
4-5. Topalov 2727
4-5. Vachier-Lagrave 2727
6. Duda 2726
7. Keymer 2717
8. Van Foreest 2707
9. Sjugirov 2705
10. Deac 2701
Former Soviet Union:
1. Nepomniachtchi 2771
2. Karjakin 2750
3. Radjabov 2745
4. Mamedyarov 2734
5. Grischuk 2732
6. Abdusattorov 2716
7. Dubov 2710
8. Artemiev 2697
9. Martirosyan 2696
10. Eljanov 2695
Former British Empire:
1. Caruana 2786
2. Nakamura 2780
3. Gukesh 2758
4. Anand 2754
5. So 2753
6. Dominguez 2745
7. Aronian 2742
8. Praggnanandhaa 2738
9-10. Harikrishna 2716
9-10. Vidit 2716
Asia:
1. Ding 2780
2. Gukesh 2758
3. Anand 2754
4. Praggnanandhaa 2738
5. Le 2733
6. Wei 2726
7. Yu 2720
8-10. Harikrishna 2716
8-10. Abdusattorov 2716
8-10. Vidit 2716
Born later than the world champion:
1. Firouzja 2777
2. Giri 2760
3. Gukesh 2758
4. So 2753
5. Rapport 2752
6. Praggnanandhaa 2738
7-8. Duda 2726
7-8. Wei 2726
9. Yu 2720
10. Keymer 2717
Nuclear powers:
1. Caruana 2786
2-3. Ding 2780
2-3. Nakamura 2780
4. Firouzja 2777
5. Nepomniachtchi 2771
6. Gukesh 2758
7. Anand 2754
8. So 2753
9. Karjakin 2750
10. Dominguez 2745
Old Guard:
1. Anand 2754
2. Dominguez 2745
3. Aronian 2742
4. Grischuk 2732
5. Topalov 2727
6. Eljanov 2695
7. Svidler 2689
8. Sargissian 2686
9. Vallejo 2680
10. Kasimdzhanov 2673 |
|
Oct-12-23
 | | OhioChessFan: Thanks for everything. |
|
Oct-19-23 | | Rdb: Thank you for your inputs about maths, <alexmagnus>. 2) A different question, if you please.
The Israel-Hamas war. I was reminded of your post from many, many years ago where you said (if I recall correctly) something like "<I, <alexmagnus>, may be considered a Jew according to some criterion ... I have relatives in Israel ... I have visited Israel ... I find amusing the overcharged, intense positions of online supporters of both Israel and the Palestinians, who have never met Palestinians or Israelis in real life ... both Palestinians and Israelis are nowhere near that intense about these issues>" Correct? |
|
Oct-20-23
 | | alexmagnus: Well, yes, I still maintain that both the Israelis and the Palestinians are discussing the Conflict much less than the outsiders. Unless it flares up. Things have changed over the last few years though when it comes to the Conflict. What happened, in a long process that started probably some 5-6 years ago, is that both Israel and the Palestinians stopped thinking that there are <any> civilians on the other side. The massacre of October 7 and Israel's reaction to it were just the culmination of this "de-civilization" of the respective enemy. Now, to an (average) Israeli, every Palestinian is a terrorist, be it a baby. And to an (average) Palestinian, every Israeli is an occupier, be it a baby. This mindset, on both sides, is not born out of nowhere but is the consequence of the actions of each side. Even more, both sides are <interested> in maintaining the respective image in the enemy's eyes. Because both sides want to provoke the other side to a war. Especially with the current extreme government in Israel (Hamas has always been extreme, so on the Palestinian side not much changed). This "there are no civilians" approach explains both things like the October 7 massacre and the treatment of the Palestinians by Israel. Both sides act logically if you consider this view. But both have a flaw even under this barbaric view: the enemy has nowhere to go. Israel considers Palestinians terrorists, so they turned Palestine into a prison. But unlike an actual prison, there is no way out: no parole, no pardon, no release on good behavior, no end of the term. So there is no incentive for Palestinians to give up violence and terrorism, and such treatment only creates more violence and terrorism. Palestine thinks every Israeli is an occupier, and so they deal the way colonized nations deal with occupiers: terrorism. But unlike actual occupiers driven out by terrorism, Israel has nowhere to go. There is no "colonial motherland" here. Israel <is> their home.
The Palestinians should adopt peaceful resistance. Apartheid South Africa models their situation much better than Algeria. And note, by the way, that apartheid in South Africa ended because the whites were <convinced> to end it. Not by the violence in Soweto, but by international sanctions, pressure, and the peaceful parts of the black resistance. Eventually it was a white president that initiated the reforms to end apartheid, and a whites-only referendum that approved those reforms. Palestinians should take up peaceful resistance, and the international community should sanction Israel. In return, Palestinians should <give up> any claim to Israel within the 1967 borders. And be merciless on those who still raise such claims or sabotage peaceful resistance. Israel, in turn, should reward non-violence. And offer a comprehensive deoccupation plan in case the Palestinians hold to non-violence. And stick to that plan, ignoring any internal protests of religious fanatics and "security" freaks, who in reality don't give a damn about security. In the end, there should be two independent, sovereign, free and democratic states. |
|
Oct-20-23 | | Rdb: Thanks for the response, <alexmagnus>. |
|
Oct-21-23 | | Rdb: Reposted in this forum from the rogoff forum:
<alexmagnus> says the average Israeli considers all Palestinians the enemy, be it a baby, and the average Palestinian considers all Israelis occupiers, be it a baby.
I tried to be a worthy student of <johnlspouge> and looked for substantiation by googling things like <how many Israelis consider all Palestinians as enemy>. Did not get any relevant result.
So, the search is not that simple/easy.
Can somebody provide substantiation? Thank you 🙏🙏 |
|