I have used several other notebooks to simulate data and learn priors via an EB approach. This notebook puts together what we have.
Here are the weights assigned for this simulation of 50 conditions.
Here is the EB result (using extreme deconvolution) for FLASH, X'X and PCA:
%preview ../dsc/mnm_prototype/mnm_sumstats/artificial_mixture_eb.png
For the GTEx-mixture-based simulation, I first learned the actual weights and mixture matrices from GTEx V8 data using extreme deconvolution, then used them for simulation. With the simulated data I re-learned the mixtures and their weights using extreme deconvolution. Here is the result for FLASH, PCA and X'X:
%preview ../dsc/mnm_prototype/mnm_sumstats/gtex_mixture_eb.png
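As a rough sketch of the simulation step above, effects can be drawn from a learned mixture-of-multivariate-normals prior. The weights and covariance components below are toy stand-ins, not the values actually learned from GTEx:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical mixture prior: K covariance components with weights,
# standing in for what extreme deconvolution would return.
R = 6                                   # number of conditions
weights = np.array([0.5, 0.3, 0.2])     # mixture weights, sum to 1
U = [np.eye(R),                                           # independent effects
     np.ones((R, R)) / R + 1e-6 * np.eye(R),              # fully shared effects
     np.diag([1.0] + [0.0] * (R - 1)) + 1e-6 * np.eye(R)] # condition-1 specific

def simulate_effects(n, weights, U, rng):
    """Draw n effect vectors from the mixture of multivariate normals."""
    ks = rng.choice(len(weights), size=n, p=weights)
    return np.stack([rng.multivariate_normal(np.zeros(R), U[k]) for k in ks])

B = simulate_effects(100, weights, U, rng)
print(B.shape)  # (100, 6)
```

The small diagonal jitter keeps the low-rank components numerically positive definite for sampling.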
Please check out this page for more details.
%cd ../dsc/mnm_prototype/mnm_20200510
Artificial mixture:
%preview PIP_comparison_0510.artificial_mixture.global.pip_evaluation.png
GTEx mixture:
%preview PIP_comparison_0510.gtex_mixture.global.pip_evaluation.png
Artificial mixture:
%preview PIP_comparison_0510.artificial_mixture.global.roc.pdf -s png
GTEx mixture:
%preview PIP_comparison_0510.gtex_mixture.global.roc.pdf -s png
Artificial mixture:
%preview PIP_comparison_0510.artificial_mixture.global.pr.pdf -s png
GTEx mixture:

%preview PIP_comparison_0510.gtex_mixture.global.pr.pdf -s png
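The ROC and PR panels above come from the DSC pipeline. As a minimal illustration of what such an evaluation computes (the PIPs and truth below are made up, not taken from the benchmark), ROC points can be traced by thresholding PIPs against the simulated truth:

```python
import numpy as np

def roc_points(pip, truth, thresholds):
    """Return (FPR, TPR) pairs from thresholding PIPs against known truth."""
    pts = []
    for t in thresholds:
        called = pip >= t
        tp = np.sum(called & truth)          # true signals called at threshold t
        fp = np.sum(called & ~truth)         # null variables called at threshold t
        tpr = tp / max(truth.sum(), 1)
        fpr = fp / max((~truth).sum(), 1)
        pts.append((fpr, tpr))
    return pts

pip = np.array([0.95, 0.80, 0.30, 0.10, 0.02])
truth = np.array([True, True, False, False, False])
print(roc_points(pip, truth, [0.9, 0.5, 0.05]))
```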
Here I simulated an artificial mixture of 6 conditions.
I only ran it on 100 replicates (genes) and 300 variables, because some MT-HESS runs take days to complete.
Note that here I have to use 1 - condition-specific lfsr
as a proxy for the condition-specific PIP reported by MT-HESS.
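The proxy itself is a one-line transform; a minimal sketch (the lfsr matrix below is illustrative, not from the actual runs):

```python
import numpy as np

# Convert a condition-specific lfsr matrix into a pseudo-PIP so it can
# share an evaluation axis with the condition-specific PIP from MT-HESS.
lfsr = np.array([[0.01, 0.40],
                 [0.90, 0.05]])          # rows: variables, columns: conditions
pip_proxy = np.clip(1.0 - lfsr, 0.0, 1.0)
print(pip_proxy)
```

The clip is a guard so the proxy stays in [0, 1] even if numerical noise pushes an lfsr slightly outside that range.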
%cd ../mthess_20200526
%preview PIP_comparison_0526.artificial_mixture_small.pip_evaluation.png
%preview PIP_comparison_0526.artificial_mixture_small.roc.pdf -s png
%preview PIP_comparison_0526.artificial_mixture_small.pr.pdf -s png