Commit

update
alexjungaalto committed Dec 22, 2023
1 parent 1f0e184 commit 29b23e3
Showing 3 changed files with 125 additions and 33 deletions.
70 changes: 66 additions & 4 deletions AoFAaltoCS.ipynb
@@ -56,7 +56,7 @@
},
{
"cell_type": "code",
"execution_count": null,
"execution_count": 4,
"id": "6142febe",
"metadata": {},
"outputs": [
@@ -77,7 +77,69 @@
"Eero Hyvönen\n",
"Perttu Hämäläinen\n",
"Alex Jung\n",
"Juho Kannala\n"
"Juho Kannala\n",
"Petteri Kaski\n",
"Samuel Kaski\n",
"Sándor Kisfaludi-Bak\n",
"Maarit Korpi-Lagg\n",
"Juhi Kulshrestha\n",
"Russell W. F. Lai\n",
"Jouko Lampinen\n",
"Casper Lassenius\n",
"Jaakko Lehtinen\n",
"Janne Lindqvist\n",
"Harri Lähdesmäki\n",
"Lauri Malmi\n",
"Heikki Mannila\n",
"Pekka Marttinen\n",
"Ilkka Niemelä\n",
"Marko Nieminen\n",
"Pekka Orponen\n",
"Alexandru Paler\n",
"Jussi Rintanen\n",
"Juho Rousu\n",
"Jari Saramäki\n",
"Arno Solin\n",
"Jukka Suomela\n",
"Linh Truong\n",
"Jara Joel Olavi Uitto\n",
"Aki Vehtari\n",
"Johanna Viitanen\n",
"Petri Vuorimaa\n",
"Robin Welsch\n",
"Antti Ylä-Jääski\n",
"Bo Zhao\n",
"Mikko Kiviharju\n",
"Tero Ilmari Ojanperä\n",
"Nitin Sawhney\n",
"Talayeh Aledavood\n",
"Lachlan Gunn\n",
"Lassi Haaranen\n",
"Arto Hellas\n",
"Vesa Hirvisalo\n",
"Jaakko Hollmen\n",
"Wilhelmiina Hämäläinen\n",
"Tommi Junttila\n",
"Barbara Esther Keller\n",
"Ari Korhonen\n",
"Sari Kujala\n",
"Jorma Laaksonen\n",
"Riku Linna\n",
"Mika P. Nieminen\n",
"Kerttu Pollari-Malmi\n",
"Risto Sarvas\n",
"Otto Seppälä\n",
"Juha Sorva\n",
"Sanna Suoranta\n",
"Jari-Pekka Vanhanen\n",
"N Asokan\n",
"Jari Collin\n",
"Aristides Gionis\n",
"Petri Myllymäki\n",
"Marko Turpeinen\n",
"Tapio Lokki\n",
"Mikko Sams\n",
"Simo Särkkä\n"
]
}
],
@@ -116,7 +178,7 @@
},
{
"cell_type": "code",
"execution_count": 9,
"execution_count": 5,
"id": "8c9e21c9",
"metadata": {},
"outputs": [
@@ -126,7 +188,7 @@
"Text(0.5, 1.0, 'Distribution of Tax-Payer Money via Research Council of Finland')"
]
},
"execution_count": 9,
"execution_count": 5,
"metadata": {},
"output_type": "execute_result"
},
87 changes: 58 additions & 29 deletions index.html
@@ -26,21 +26,32 @@ <h1>Alexander Jung</h1>
<table class="imgtable"><tr><td>
<a href="https://github.com/alexjungaalto/MachineLearningTheBasics/blob/master/MLBasicsBook.pdf"><img src="MLBook.png" alt="alt text" width="200px" /></a>&nbsp;</td>
<td align="left"><ul>
<li><p>Dipl.-Ing. Dr. techn. (<a href="https://en.wikipedia.org/wiki/Sub_auspiciis_Praesidentis" target="_blank">"sub auspiciis"</a>) </p>
<li><p>Dipl.-Ing. Dr. techn. (<a href="https://en.wikipedia.org/wiki/Sub_auspiciis_Praesidentis" target="_blank">"sub auspiciis"</a>)
</p>
</li>
<li><p>Associate Professor (tenured) for Machine Learning, Aalto University<br /> </p>
<li><p>Associate Professor (tenured) for Machine Learning, Aalto University<br />
</p>
</li>
<li><p>Associate Editor for IEEE Signal Processing Letters (<a href="https://signalprocessingsociety.org/publications-resources/ieee-signal-processing-letters/editorial-board" target="_blank">website</a>) <br /></p>
<li><p>Associate Editor for IEEE Signal Processing Letters (<a href="https://signalprocessingsociety.org/publications-resources/ieee-signal-processing-letters/editorial-board" target="_blank">website</a>) <br />
</p>
</li>
<li><p>Editorial Board Member, &ldquo;Machine Learning&rdquo; (Springer) (<a href="https://www.springer.com/journal/10994/editors" target="_blank">website</a>) <br /></p>
<li><p>Editorial Board Member, &ldquo;Machine Learning&rdquo; (Springer) (<a href="https://www.springer.com/journal/10994/editors" target="_blank">website</a>) <br />
</p>
</li>
<li><p>Follow me on <a href="https://www.linkedin.com/in/aljung/" target="_blank">LinkedIn</a></p>
<li><p>Follow me on <a href="https://www.linkedin.com/in/aljung/" target="_blank">LinkedIn</a>
</p>
</li>
<li><p>Subscribe to my <a href="https://www.youtube.com/channel/UC_tW4Z_GfJ2WCnKDtwMuDUA" target="_blank">Machine Learning YouTube</a> channel</p>
<li><p>Subscribe to my <a href="https://www.youtube.com/channel/UC_tW4Z_GfJ2WCnKDtwMuDUA" target="_blank">Machine Learning YouTube</a> channel
</p>
</li>
<li><p>Fork me on <a href="https://github.com/alexjungaalto" target="_blank">GitHub</a></p>
<li><p>Fork me on <a href="https://github.com/alexjungaalto" target="_blank">GitHub</a>
</p>
</li>
<li><p>Textbook: &ldquo;Machine Learning - The Basics&rdquo;, Springer, 2022 (<a href="http://mlbook.cs.aalto.fi" target="_blank">"draft"</a> ) <br /></p>
<li><p>Textbook: &ldquo;Machine Learning - The Basics&rdquo;, Springer, 2022 (<a href="http://mlbook.cs.aalto.fi" target="_blank">"draft"</a> ) <br />
</p>
</li>
<li><p><a href="https://www.researchgate.net/profile/Alexander-Jung" target="_blank">Alexander Jung on ResearchGate</a>
</p>
</li>
</ul>
</td></tr></table>
@@ -49,19 +60,23 @@ <h2>About Me</h2>
in 2008 and 2012, respectively. Currently, I am an Associate Professor (tenured) for Machine Learning at the Department of
Computer Science of Aalto University. My research and teaching revolve around the mathematical foundations of trustworthy
machine learning with an emphasis on application domains that generate networked data. These application domains
include numerical weather prediction, renewable energy networks, city planning, and condition monitoring. </p>
include numerical weather prediction, renewable energy networks, city planning, and condition monitoring.
</p>
<h2>Updates </h2>
<ul>
<li><p>New preprint on &ldquo;Towards Model-Agnostic Federated Learning over Networks&rdquo; available. <a href="https://arxiv.org/abs/2302.04363" target="_blank">click me</a> </p>
<li><p>New preprint on &ldquo;Towards Model-Agnostic Federated Learning over Networks&rdquo; available. <a href="https://arxiv.org/abs/2302.04363" target="_blank">click me</a>
</p>
</li>
</ul>
<h2>For (Prospective) Master Students </h2>
<ul>
<li><p>I have prepared a starter kit for master thesis workers <a href="https://github.com/alexjungaalto/masterthesis" target="_blank">here</a> </p>
<li><p>I have prepared a starter kit for master thesis workers <a href="https://github.com/alexjungaalto/masterthesis" target="_blank">here</a>
</p>
</li>
</ul>
<ul>
<li><p>I have started to share recordings of our group meetings on my YouTube channel. <a href="https://youtube.com/playlist?list=PLrbn2dGrLJK8wsi_vpr94Gzas7TzUsFNh" target="_blank">Playlist</a> </p>
<li><p>I have started to share recordings of our group meetings on my YouTube channel. <a href="https://youtube.com/playlist?list=PLrbn2dGrLJK8wsi_vpr94Gzas7TzUsFNh" target="_blank">Playlist</a>
</p>
</li>
</ul>
<h2>Research Highlight: Computational and Statistical Aspects of Total Variation Minimization for Federated Learning</h2>
@@ -71,22 +86,29 @@ <h2>Research Highlight: Computational and Statistical Aspects of Total Variation
parameters. The statistical properties of local datasets are related via different network structures that reflect physical (&ldquo;contact networks&rdquo;),
social or biological proximity. In general, local datasets are heterogeneous in the sense of having different statistical distributions.
However, we can often approximate local datasets that form a tight-knit cluster by a common cluster-specific distribution.
<br /> </p>
<br />
</p>
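<p>The clustering assumption above is typically operationalized as a generalized total variation minimization. As a sketch (notation assumed, following the network Lasso literature: \(L_i\) is the local empirical risk at node \(i\), \(A_{ij} \ge 0\) are edge weights, and \(\lambda\) trades local fit against smoothness over the network):</p>

```latex
\min_{\{\mathbf{w}_i\}_{i \in \mathcal{V}}} \;
  \sum_{i \in \mathcal{V}} L_i(\mathbf{w}_i)
  \;+\; \lambda \sum_{(i,j) \in \mathcal{E}} A_{ij}\, \bigl\|\mathbf{w}_i - \mathbf{w}_j\bigr\|_2
```

<p>For sufficiently large \(\lambda\), the penalty pushes the \(\mathbf{w}_i\) to be (nearly) constant over tight-knit clusters, which is exactly the pooling of statistically similar local datasets described above.</p>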
<p>To capitalize on the information in local datasets and their network structure, we have recently proposed networked exponential
families as a novel probabilistic model for big data over networks. Networked exponential families are appealing statistically and
computationally. They allow us to adaptively pool local datasets with similar statistical properties as training sets to learn personalized
predictions tailored to each local dataset. We can compute these personalized predictions using highly scalable distributed
convex optimization methods. These methods are robust against various types of imperfections (statistically and computationally)
and typically offer a high level of privacy protection. </p>
<p><b>Relevant Publications:</b></p>
and typically offer a high level of privacy protection.
</p>
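<p>The adaptive pooling described above can be sketched in a few lines. This is a hypothetical toy (the graph, datasets, and step sizes are invented), and it replaces the non-smooth total variation penalty of the cited papers, which is minimized there with primal-dual methods, by a squared (Laplacian) surrogate solved with plain gradient descent:</p>

```python
import numpy as np

# Toy setup: four nodes on a chain graph, each holding a small local
# linear-regression dataset. Nodes 0-1 and 2-3 form two clusters with
# different true parameter vectors.
rng = np.random.default_rng(0)
edges = [(0, 1), (1, 2), (2, 3)]
w_true = [np.array([1.0, -1.0])] * 2 + [np.array([-2.0, 0.5])] * 2
data = []
for w in w_true:
    X = rng.normal(size=(5, 2))
    data.append((X, X @ w + 0.01 * rng.normal(size=5)))

lam, step = 0.5, 0.05
W = np.zeros((4, 2))  # one local parameter vector per node
for _ in range(2000):
    grad = np.zeros_like(W)
    for i, (X, y) in enumerate(data):
        grad[i] = 2 * X.T @ (X @ W[i] - y) / len(y)  # local empirical risk
    for i, j in edges:                               # coupling across edges
        d = 2 * lam * (W[i] - W[j])
        grad[i] += d
        grad[j] -= d
    W -= step * grad
# After training, W[0] and W[1] (same cluster) are pulled together, while
# W[1] and W[2] (cluster boundary) remain further apart.
```

<p>With the non-squared total variation penalty, the learned parameters become piecewise constant over the clusters; the smooth surrogate used here only shrinks the differences, but preserves the pooling effect.</p>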
<p><b>Relevant Publications:</b>
</p>
<ul>
<li><p>A. Jung, &ldquo;On the Duality Between Network Flows and Network Lasso,&rdquo; in IEEE Signal Processing Letters, vol. 27, pp. 940-944, 2020, doi: 10.1109/LSP.2020.2998400.</p>
<li><p>A. Jung, &ldquo;On the Duality Between Network Flows and Network Lasso,&rdquo; in IEEE Signal Processing Letters, vol. 27, pp. 940-944, 2020, doi: 10.1109/LSP.2020.2998400.
</p>
</li>
<li><p>A. Jung, &ldquo;Networked Exponential Families for Big Data Over Networks,&rdquo; in IEEE Access, vol. 8, pp. 202897-202909, 2020, doi: 10.1109/ACCESS.2020.3033817.</p>
<li><p>A. Jung, &ldquo;Networked Exponential Families for Big Data Over Networks,&rdquo; in IEEE Access, vol. 8, pp. 202897-202909, 2020, doi: 10.1109/ACCESS.2020.3033817.
</p>
</li>
<li><p>A. Jung, A. O. Hero, III, A. C. Mara, S. Jahromi, A. Heimowitz and Y. C. Eldar, &ldquo;Semi-Supervised Learning in Network-Structured Data via Total Variation Minimization,&rdquo; in IEEE Transactions on Signal Processing, vol. 67, no. 24, pp. 6256-6269, Dec., 2019, doi: 10.1109/TSP.2019.2953593.</p>
<li><p>A. Jung, A. O. Hero, III, A. C. Mara, S. Jahromi, A. Heimowitz and Y. C. Eldar, &ldquo;Semi-Supervised Learning in Network-Structured Data via Total Variation Minimization,&rdquo; in IEEE Transactions on Signal Processing, vol. 67, no. 24, pp. 6256-6269, Dec., 2019, doi: 10.1109/TSP.2019.2953593.
</p>
</li>
<li><p>A. Jung and N. Tran, &ldquo;Localized Linear Regression in Networked Data,&rdquo; in IEEE Signal Processing Letters, vol. 26, no. 7, pp. 1090-1094, July 2019, doi: 10.1109/LSP.2019.2918933.</p>
<li><p>A. Jung and N. Tran, &ldquo;Localized Linear Regression in Networked Data,&rdquo; in IEEE Signal Processing Letters, vol. 26, no. 7, pp. 1090-1094, July 2019, doi: 10.1109/LSP.2019.2918933.
</p>
</li>
</ul>
<h2>Research Highlight: Personalized Explainable Machine Learning </h2>
@@ -96,33 +118,40 @@ <h2>Research Highlight: Personalized Explainable Machine Learning </h2>
entropy of the prediction given the summary that a particular user associates with data points. The user summary is
used to characterise the background knowledge of the &ldquo;explainee&rdquo; in order to compute explanations that are tailored for her.
To compute the explanations, our method only requires training samples that consist of data points and their corresponding
predictions and user summaries. Thus, our method is model agnostic and can be used to compute explanations for different machine learning methods. </p>
<p><b>Relevant Publications:</b></p>
predictions and user summaries. Thus, our method is model agnostic and can be used to compute explanations for different machine learning methods.
</p>
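<p>The conditional-entropy criterion can be made concrete with a discrete toy example. This sketch is not the method from the paper: the data, the candidate summaries, and the plug-in estimator are all invented for illustration. Here the predictions are fully determined by feature A, so a user summary built from A has zero conditional entropy, while a summary built from the independent feature B explains nothing:</p>

```python
import numpy as np
from collections import Counter

def cond_entropy(pred, summary):
    """Empirical conditional entropy H(pred | summary), in bits,
    for discrete-valued numpy arrays."""
    h = 0.0
    for u in np.unique(summary):
        mask = summary == u
        probs = np.array(list(Counter(pred[mask]).values())) / mask.sum()
        h += mask.mean() * -(probs * np.log2(probs)).sum()
    return h

# Invented toy data: predictions determined by feature A; B is noise.
rng = np.random.default_rng(1)
A = rng.integers(0, 2, 200)
B = rng.integers(0, 2, 200)
pred = A.copy()

h_A = cond_entropy(pred, A)  # ~0 bits: summary A explains the predictions
h_B = cond_entropy(pred, B)  # ~1 bit: summary B carries no information
```

<p>Under this criterion, the summary with the lower conditional entropy (here, the one derived from feature A) is the better-tailored explanation for that user.</p>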
<p><b>Relevant Publications:</b>
</p>
<ul>
<li><p>A. Jung, “Explainable Empirical Risk Minimization”, arXiv eprint, 2020. <a href="https://arxiv.org/abs/2009.01492">weblink</a></p>
<li><p>A. Jung, “Explainable Empirical Risk Minimization”, arXiv eprint, 2020. <a href="https://arxiv.org/abs/2009.01492" target="_blank">weblink</a>
</p>
</li>
<li><p>A. Jung and P. H. J. Nardelli, &ldquo;An Information-Theoretic Approach to Personalized Explainable Machine Learning,&rdquo; in IEEE Signal Processing Letters, vol. 27, pp. 825-829, 2020, doi: 10.1109/LSP.2020.2993176.</p>
<li><p>A. Jung and P. H. J. Nardelli, &ldquo;An Information-Theoretic Approach to Personalized Explainable Machine Learning,&rdquo; in IEEE Signal Processing Letters, vol. 27, pp. 825-829, 2020, doi: 10.1109/LSP.2020.2993176.
</p>
</li>
</ul>
<h2>Teaching Highlight: Student Feedback-Driven Course Development</h2>
<p><img src="ThreeCompCycle.png" width="500" align="left" /> Right from my start at Aalto in 2015, I took care of the main machine
learning courses at Aalto University. Within three years I have re-designed the spearhead course Machine Learning: Basic Principles (MLBP).
This re-design was based on a careful analysis of feedback received from several thousand students. I have also started to
prepare <a href="mlcourseresponse.pdf" target="_blank">response letters</a> to the student feedback, as is customary in the
prepare <a href="mlcourseresponse.pdf" target="_blank">response letters</a> to the student feedback, as is customary in the
review process of scientific journals. My final edition of MLBP in 2018 achieved the best student rating since the course was
established at Aalto. The efforts have also been acknowledged by the <a href="TeacherAward.png" target="_blank">Teacher of the Year</a>
award, which I received in 2018 from the Department of Computer Science at Aalto University.</p>
established at Aalto. The efforts have also been acknowledged by the <a href="TeacherAward.png" target="_blank">Teacher of the Year</a>
award, which I received in 2018 from the Department of Computer Science at Aalto University.
</p>
<h2>Teaching Highlight: A Three-Component Picture of Machine Learning</h2>
<p>Machine learning methods are being popularized in virtually every field of science and technology. As a result, machine learning
courses attract students from different study programs. Thus, a key challenge in teaching basic machine learning courses is the heterogeneity
of student backgrounds. To cope with this challenge, I have developed a new teaching concept for machine learning. This teaching concept
revolves around three main components of machine learning: data, models, and loss functions. By decomposing every machine learning
method into specific design choices for data representation, model, and loss function, students learn to navigate the vast landscape of machine learning
methods and applications. The three-component picture of machine learning is the main subject of my textbook <a href="https://github.com/alexjungaalto/MachineLearningTheBasics/blob/master/MLBasicsBook.pdf" target="_blank">Machine Learning: The Basics</a>. </p>
<p><img src="MLLandscape.png" width="500" align="bottom" /> </p>
methods and applications. The three-component picture of machine learning is the main subject of my textbook <a href="https://github.com/alexjungaalto/MachineLearningTheBasics/blob/master/MLBasicsBook.pdf" target="_blank">Machine Learning: The Basics</a>.
</p>
<p><img src="MLLandscape.png" width="500" align="bottom" /> </p>
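<p>The three-component decomposition can be illustrated with a toy snippet (all data, numbers, and names invented). Fixing the data representation and a linear model, swapping only the loss function switches the method from least-squares to least-absolute-deviation regression:</p>

```python
import numpy as np

def empirical_risk(w, X, y, loss):
    """Average loss of the linear hypothesis h(x) = x @ w on dataset (X, y)."""
    return np.mean([loss(yi, xi @ w) for xi, yi in zip(X, y)])

def squared_loss(y, y_hat):      # choice -> linear (least-squares) regression
    return (y - y_hat) ** 2

def absolute_loss(y, y_hat):     # choice -> robust (least-absolute) regression
    return abs(y - y_hat)

X = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])  # data: feature vectors
y = np.array([2.0, -1.0, 1.5])                      # data: labels
w = np.array([2.0, -1.0])                           # one candidate hypothesis

# Same data and model; only the loss component changes:
risk_sq = empirical_risk(w, X, y, squared_loss)    # (0 + 0 + 0.25) / 3
risk_abs = empirical_risk(w, X, y, absolute_loss)  # (0 + 0 + 0.5) / 3
```

<p>Each design choice can be varied independently, which is how the three-component picture organizes the landscape of machine learning methods.</p>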
<div id="footer">
<div id="footer-text">
Page generated 2023-09-07 19:50:25 EEST, by <a href="http://jemdoc.jaboc.net/">jemdoc</a>.
Page generated 2023-12-22 14:53:09 EET, by <a href="https://github.com/wsshin/jemdoc_mathjax" target="_blank">jemdoc+MathJax</a>.
</div>
</div>
</td>
1 change: 1 addition & 0 deletions index.jemdoc
@@ -11,6 +11,7 @@
- Subscribe to my {{<a href="https://www.youtube.com/channel/UC_tW4Z_GfJ2WCnKDtwMuDUA" target="_blank">Machine Learning YouTube</a>}} channel
- Fork me on {{<a href="https://github.com/alexjungaalto" target="_blank">GitHub</a>}}
- Textbook: "Machine Learning - The Basics", Springer, 2022 ({{<a href="http://mlbook.cs.aalto.fi" target="_blank">"draft"</a>}} ) \n
- {{<a href="https://www.researchgate.net/profile/Alexander-Jung" target="_blank">Alexander Jung on ResearchGate</a>}}
~~~

== About Me
