Commit

Merge branch 'main' of github.com:VirtualPatientEngine/literatureSurvey into chore/README
gurdeep330 committed Dec 25, 2024
2 parents ef0986e + 42f6c20 commit 756b097
Showing 64 changed files with 1,902 additions and 1,814 deletions.
34 changes: 29 additions & 5 deletions docs/recommendations/06a0ba437d41a7c82c08a9636a4438c1b5031378.md
@@ -11,7 +11,7 @@ hide:

<body>
<p>
<i class="footer">This page was last updated on 2024-09-11 16:11:58 UTC</i>
<i class="footer">This page was last updated on 2024-11-11 06:05:45 UTC</i>
</p>

<div class="note info" onclick="startIntro()">
@@ -50,7 +50,7 @@ hide:
</td>
<td>2023-03-15</td>
<td>PLOS Computational Biology</td>
<td>3</td>
<td>5</td>
<td>55</td>
</tr>

@@ -110,8 +110,8 @@ hide:
</td>
<td>2022-06-01</td>
<td>Nonlinear Dynamics</td>
<td>29</td>
<td>91</td>
<td>30</td>
<td>92</td>
</tr>

<tr id="Systems biology is a new discipline built upon the premise that an understanding of how cells and organisms carry out their functions cannot be gained by looking at cellular components in isolation. Instead, consideration of the interplay between the parts of systems is indispensable for analyzing, modeling, and predicting systems' behavior. Studying biological processes under this premise, systems biology combines experimental techniques and computational methods in order to construct predictive models. Both in building and utilizing models of biological systems, inverse problems arise at several occasions, for example, (i) when experimental time series and steady state data are used to construct biochemical reaction networks, (ii) when model parameters are identified that capture underlying mechanisms or (iii) when desired qualitative behavior such as bistability or limit cycle oscillations is engineered by proper choices of parameter combinations. In this paper we review principles of the modeling process in systems biology and illustrate the ill-posedness and regularization of parameter identification problems in that context. Furthermore, we discuss the methodology of qualitative inverse problems and demonstrate how sparsity enforcing regularization allows the determination of key reaction mechanisms underlying the qualitative behavior.">
@@ -126,6 +126,18 @@ hide:
<td>48</td>
</tr>

<tr id="Biological systems exhibit complex dynamics that differential equations can often adeptly represent. Ordinary differential equation models are widespread; until recently their construction has required extensive prior knowledge of the system. Machine learning methods offer alternative means of model construction: differential equation models can be learnt from data via model discovery using sparse identification of nonlinear dynamics (SINDy). However, SINDy struggles with realistic levels of biological noise and does not incorporate prior knowledge of the system. We propose a data-driven framework for model discovery and model selection using hybrid dynamical systems: partial models containing missing terms. Neural networks are used to approximate the unknown dynamics of a system, enabling the denoising the data while simultaneously learning the latent dynamics. Simulations from the fitted neural network are then used to infer models using SINDy. We show, via model selection, that model discovery in SINDy with hybrid dynamical systems outperforms alternative approaches. We find it possible to infer models correctly up to high levels of biological noise of different types. We demonstrate the potential to learn models from sparse, noisy data in application to a canonical cell state transition using data derived from single-cell transcriptomics. Overall, this approach provides a practical framework for model discovery in biology in cases where data are noisy and sparse, of particular utility when the underlying biological mechanisms are partially but incompletely known.">
<td id="tag"><i class="material-icons">visibility_off</i></td>
<td><a href="https://www.semanticscholar.org/paper/596f63cbf87cf14a0e7859c17058a82b15c760ef" target='_blank'>Data-driven model discovery and model selection for noisy biological systems</a></td>
<td>
Xiaojun Wu, MeiLu McDermott, Adam L. Maclean
</td>
<td>2024-10-04</td>
<td>bioRxiv</td>
<td>0</td>
<td>19</td>
</tr>

<tr id="None">
<td id="tag"><i class="material-icons">visibility_off</i></td>
<td><a href="https://www.semanticscholar.org/paper/1626b462b65d4084a24fb31c4e4ca3fc212a307a" target='_blank'>Set-base dynamical parameter estimation and model invalidation for biochemical reaction networks</a></td>
@@ -135,7 +147,19 @@ hide:
<td>2010-05-25</td>
<td>BMC Systems Biology</td>
<td>67</td>
<td>47</td>
<td>48</td>
</tr>

<tr id="The combination of machine learning (ML) and sparsity-promoting techniques is enabling direct extraction of governing equations from data, revolutionizing computational modeling in diverse fields of science and engineering. The discovered dynamical models could be used to address challenges in climate science, neuroscience, ecology, finance, epidemiology, and beyond. However, most existing sparse identification methods for discovering dynamical systems treat the whole system as one without considering the interactions between subsystems. As a result, such models are not able to capture small changes in the emergent system behavior. To address this issue, we developed a new method called Sparse Identification of Nonlinear Dynamical Systems from Graph-structured data (SINDyG), which incorporates the network structure into sparse regression to identify model parameters that explain the underlying network dynamics. SINDyG discovers the governing equations of network dynamics while offering improvements in accuracy and model simplicity.">
<td id="tag"><i class="material-icons">visibility_off</i></td>
<td><a href="https://www.semanticscholar.org/paper/4da518ae886702aed27abdd8f94fea3042ab9377" target='_blank'>Discovering Governing equations from Graph-Structured Data by Sparse Identification of Nonlinear Dynamical Systems</a></td>
<td>
Mohammad Amin Basiri, Sina Khanmohammadi
</td>
<td>2024-09-02</td>
<td>ArXiv</td>
<td>0</td>
<td>3</td>
</tr>

</tbody>
20 changes: 10 additions & 10 deletions docs/recommendations/0acd117521ef5aafb09fed02ab415523b330b058.md
@@ -11,7 +11,7 @@ hide:

<body>
<p>
<i class="footer">This page was last updated on 2024-09-11 16:12:06 UTC</i>
<i class="footer">This page was last updated on 2024-11-11 06:05:51 UTC</i>
</p>

<div class="note info" onclick="startIntro()">
@@ -50,7 +50,7 @@ hide:
</td>
<td>2023-11-01</td>
<td>Chaos</td>
<td>3</td>
<td>4</td>
<td>11</td>
</tr>

@@ -74,8 +74,8 @@ hide:
</td>
<td>2015-09-11</td>
<td>Proceedings of the National Academy of Sciences</td>
<td>3256</td>
<td>65</td>
<td>3408</td>
<td>67</td>
</tr>

<tr id="None">
@@ -86,7 +86,7 @@ hide:
</td>
<td>2024-01-09</td>
<td>Communications Physics</td>
<td>8</td>
<td>11</td>
<td>2</td>
</tr>

@@ -98,8 +98,8 @@ hide:
</td>
<td>2020-05-05</td>
<td>Nature Communications</td>
<td>250</td>
<td>12</td>
<td>289</td>
<td>13</td>
</tr>

<tr id="Discovering the partial differential equations underlying spatio-temporal datasets from very limited and highly noisy observations is of paramount interest in many scientific fields. However, it remains an open question to know when model discovery algorithms based on sparse regression can actually recover the underlying physical processes. In this work, we show the design matrices used to infer the equations by sparse regression can violate the irrepresentability condition (IRC) of the Lasso, even when derived from analytical PDE solutions (i.e. without additional noise). Sparse regression techniques which can recover the true underlying model under violated IRC conditions are therefore required, leading to the introduction of the randomised adaptive Lasso. We show once the latter is integrated within the deep learning model discovery framework DeepMod, a wide variety of nonlinear and chaotic canonical PDEs can be recovered: (1) up to $\mathcal{O}(2)$ higher noise-to-sample ratios than state-of-the-art algorithms, (2) with a single set of hyperparameters, which paves the road towards truly automated model discovery.">
@@ -111,7 +111,7 @@ hide:
<td>2021-06-22</td>
<td>ArXiv</td>
<td>1</td>
<td>12</td>
<td>13</td>
</tr>

<tr id="We investigate the problem of learning an evolution equation directly from some given data. This work develops a learning algorithm to identify the terms in the underlying partial differential equations and to approximate the coefficients of the terms only using data. The algorithm uses sparse optimization in order to perform feature selection and parameter estimation. The features are data driven in the sense that they are constructed using nonlinear algebraic equations on the spatial derivatives of the data. Several numerical experiments show the proposed method's robustness to data noise and size, its ability to capture the true features of the data, and its capability of performing additional analytics. Examples include shock equations, pattern formation, fluid flow and turbulence, and oscillatory convection.">
@@ -123,7 +123,7 @@ hide:
<td>2017-01-16</td>
<td></td>
<td>0</td>
<td>15</td>
<td>16</td>
</tr>

<tr id="We propose a regression method based upon group sparsity that is capable of discovering parametrized governing dynamical equations of motion of a given system by time series measurements. The method balances model complexity and regression accuracy by selecting a parsimonious model via Pareto analysis. This gives a promising new technique for disambiguating governing equations from simple parametric dependencies in physical, biological and engineering systems.">
@@ -135,7 +135,7 @@ hide:
<td>2017-12-01</td>
<td>2017 IEEE 7th International Workshop on Computational Advances in Multi-Sensor Adaptive Processing (CAMSAP)</td>
<td>12</td>
<td>65</td>
<td>67</td>
</tr>

</tbody>
20 changes: 10 additions & 10 deletions docs/recommendations/0d01d21137a5af9f04e4b16a55a0f732cb8a540b.md
@@ -11,7 +11,7 @@ hide:

<body>
<p>
<i class="footer">This page was last updated on 2024-09-11 16:11:15 UTC</i>
<i class="footer">This page was last updated on 2024-11-11 06:05:10 UTC</i>
</p>

<div class="note info" onclick="startIntro()">
Expand Down Expand Up @@ -50,7 +50,7 @@ hide:
</td>
<td>2023-10-24</td>
<td>ArXiv</td>
<td>6</td>
<td>9</td>
<td>50</td>
</tr>

@@ -74,8 +74,8 @@ hide:
</td>
<td>2023-07-27</td>
<td>ArXiv</td>
<td>0</td>
<td>54</td>
<td>1</td>
<td>55</td>
</tr>

<tr id="Accurate forecasting of multivariate time series is an extensively studied subject in finance, transportation, and computer science. Fully mining the correlation and causation between the variables in a multivariate time series exhibits noticeable results in improving the performance of a time series model. Recently, some models have explored the dependencies between variables through end-to-end graph structure learning without the need for predefined graphs. However, current models do not incorporate the trade-off between efficiency and flexibility and lack the guidance of domain knowledge in the design of graph structure learning algorithms. This paper alleviates the above issues by proposing Balanced Graph Structure Learning for Forecasting (BGSLF), a novel deep learning model that joins graph structure learning and forecasting. Technically, BGSLF leverages the spatial information into convolutional operations and extracts temporal dynamics using the diffusion convolutional recurrent network. The proposed framework balance the trade-off between efficiency and flexibility by introducing Multi-Graph Generation Network (MGN) and Graph Selection Module. In addition, a method named Smooth Sparse Unit (SSU) is designed to sparse the learned graph structures, which conforms to the sparse spatial correlations in the real world. Extensive experiments on four real-world datasets demonstrate that our model achieves state-of-the-art performances with minor trainable parameters. Code will be made publicly available.">
@@ -87,7 +87,7 @@ hide:
<td>2022-01-24</td>
<td>ArXiv</td>
<td>0</td>
<td>10</td>
<td>11</td>
</tr>

<tr id="We study a recent class of models which uses graph neural networks (GNNs) to improve forecasting in multivariate time series. The core assumption behind these models is that there is a latent graph between the time series (nodes) that governs the evolution of the multivariate time series. By parameterizing a graph in a differentiable way, the models aim to improve forecasting quality. We compare four recent models of this class on the forecasting task. Further, we perform ablations to study their behavior under changing conditions, e.g., when disabling the graph-learning modules and providing the ground-truth relations instead. Based on our findings, we propose novel ways of combining the existing architectures.">
@@ -99,7 +99,7 @@ hide:
<td>2021-09-10</td>
<td>ArXiv</td>
<td>12</td>
<td>46</td>
<td>47</td>
</tr>

<tr id="Multivariate time-series forecasting is a critical task for many applications, and graph time-series network is widely studied due to its capability to capture the spatial-temporal correlation simultaneously. However, most existing works focus more on learning with the explicit prior graph structure, while ignoring potential information from the implicit graph structure, yielding incomplete structure modeling. Some recent works attempts to learn the intrinsic or implicit graph structure directly, while lacking a way to combine explicit prior structure with implicit structure together. In this paper, we propose Regularized Graph Structure Learning (RGSL) model to incorporate both explicit prior structure and implicit structure together, and learn the forecasting deep networks along with the graph structure. RGSL consists of two innovative modules. First, we derive an implicit dense similarity matrix through node embedding, and learn the sparse graph structure using the Regularized Graph Generation (RGG) based on the Gumbel Softmax trick. Second, we propose a Laplacian Matrix Mixed-up Module (LM3) to fuse the explicit graph and implicit graph together. We conduct experiments on three real-word datasets. Results show that the proposed RGSL model outperforms existing graph forecasting algorithms with a notable margin, while learning meaningful graph structure simultaneously. Our code and models are made publicly available at https://github.com/alipay/RGSL.git.">
@@ -110,8 +110,8 @@ hide:
</td>
<td>2022-07-01</td>
<td>ArXiv, DBLP</td>
<td>40</td>
<td>34</td>
<td>44</td>
<td>35</td>
</tr>

<tr id="Time series forecasting is an extensively studied subject in statistics, economics, and computer science. Exploration of the correlation and causation among the variables in a multivariate time series shows promise in enhancing the performance of a time series model. When using deep neural networks as forecasting models, we hypothesize that exploiting the pairwise information among multiple (multivariate) time series also improves their forecast. If an explicit graph structure is known, graph neural networks (GNNs) have been demonstrated as powerful tools to exploit the structure. In this work, we propose learning the structure simultaneously with the GNN if the graph is unknown. We cast the problem as learning a probabilistic graph model through optimizing the mean performance over the graph distribution. The distribution is parameterized by a neural network so that discrete graphs can be sampled differentiably through reparameterization. Empirical evaluations show that our method is simpler, more efficient, and better performing than a recently proposed bilevel learning approach for graph structure learning, as well as a broad array of forecasting models, either deep or non-deep learning based, and graph or non-graph based.">
@@ -122,8 +122,8 @@ hide:
</td>
<td>2021-01-18</td>
<td>ArXiv</td>
<td>179</td>
<td>37</td>
<td>199</td>
<td>38</td>
</tr>

<tr id="Multi-variate time series forecasting is an important problem with a wide range of applications. Recent works model the relations between time-series as graphs and have shown that propagating information over the relation graph can improve time series forecasting. However, in many cases, relational information is not available or is noisy and reliable. Moreover, most works ignore the underlying uncertainty of time-series both for structure learning and deriving the forecasts resulting in the structure not capturing the uncertainty resulting in forecast distributions with poor uncertainty estimates. We tackle this challenge and introduce STOIC, that leverages stochastic correlations between time-series to learn underlying structure between time-series and to provide well-calibrated and accurate forecasts. Over a wide-range of benchmark datasets STOIC provides around 16% more accurate and 14% better-calibrated forecasts. STOIC also shows better adaptation to noise in data during inference and captures important and useful relational information in various benchmarks.">
