Multivariate Bayesian variable selection regression

Decreasing ELBO issue

Possibly caused by comparing and setting prior variances to zero when estimating them with EM.

Context

In the mvSuSiE model with a mixture prior, to estimate the multivariate prior $b \sim \sum_k \pi_k MVN(0, \sigma_0^2 U_k)$ we fix $U_k$ and $\pi_k$ and estimate only the scalar $\sigma_0^2$.
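To make this concrete, under the standard single-effect regression calculation (a sketch, not verbatim from the code), with per-variable effect estimates $\hat b_j$ and sampling covariances $S_j$, the quantity maximized over $\sigma_0^2$ for one effect is, up to a constant,

$$\ell(\sigma_0^2) = \log \sum_{j=1}^p \frac{1}{p} \sum_k \pi_k \, N\!\left(\hat b_j;\, 0,\; \sigma_0^2 U_k + S_j\right),$$

and $\sigma_0^2 = 0$ corresponds to the null model for that effect.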

Recent updates to the mvsusieR package implemented EM updates for $\sigma_0^2$ in both MASH regression and Bayesian multivariate regression (hereafter BMR; essentially MASH with just one component, but with separate, simpler R code for prototyping purposes). I verified with some tests that IBSS using a MASH mixture with only one matrix agrees with IBSS using BMR, and that BMR, when reduced to a single response, agrees with the susieR::susie result. The EM estimates of the prior variance scalar are identical and the ELBO is non-decreasing.

Problem

However, as I moved on to more simulations I got warnings about a decreasing ELBO with the EM estimates. This seems to happen only when there are many effect variables relative to conditions; for example, the toy data below has 2 conditions, each with 2 effect variables out of 50 variables. The problem is that the EM method, as currently implemented, estimates $\sigma_0^2$ and then, at the end of each iteration, compares the update with zero and accepts whichever has the better likelihood for that single-effect model (see the sketch below). This results in a decreasing ELBO.
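Schematically, the per-effect update currently behaves like the sketch below (the names em_step_sigma0 and single_effect_loglik are hypothetical stand-ins for the internal mvsusieR routines, not the actual API):

# sketch only; hypothetical names, not the actual mvsusieR internals
em_with_zero_check = function(model, sigma0_sq) {
  s_new = em_step_sigma0(model, sigma0_sq)  # one EM update of the scalar
  # the problematic step: keep whichever of s_new and 0 gives the higher
  # single-effect log-likelihood in this iteration
  if (single_effect_loglik(model, 0) > single_effect_loglik(model, s_new))
    s_new = 0
  s_new
}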

So far we haven't observed this behavior in SuSiE: no matter how many effect variables there are, the ELBO is non-decreasing with EM updates. I believe both Yuxin and Kaiqian have benchmarked SuSiE's EM updates for the prior and did not see an issue.

Possible cause

If I switch off the comparison between the current estimate and zero at the end of each EM update (by commenting out this line), then in this example the algorithm takes longer to converge, but the ELBO is non-decreasing and eventually exceeds that of the other approaches (e.g., the simple and optim methods). The estimates also make more sense compared to the "truth" we know for this simulated data.

I think that since EM does not directly maximize the single-effect log-likelihood, we should not compare its update with zero at each iteration. Otherwise, making the comparison and setting $\sigma_0^2 = 0$ provides a very bad initialization for subsequent iterations.

To see the practical impact: a major difference between the EM update and maximizing the single-effect log-likelihood directly is that the former is a function of posterior quantities, whereas the latter involves only the data and the prior to be estimated (here for instance). As a result, with EM updates, if a previous iteration sets the estimate to zero, that affects the posterior in the next iteration and the estimate never comes back from zero. This is not the case when the prior is estimated directly, as in optim.
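As a minimal illustration of that difference (hypothetical inputs: alpha is the posterior inclusion probability vector, mu2 a list of posterior second-moment matrices $E[bb^T \mid \gamma = j]$, and U the single prior matrix of a BMR model with r conditions), the EM M-step averages posterior second moments whitened by $U$, so once the posterior has collapsed to zero the update stays at zero; a direct optimizer instead works on a data-only objective:

# sketch only, not the actual mvsusieR code
em_update = function(alpha, mu2, U) {
  r = nrow(U)
  # (1/r) * sum_j alpha_j * tr(U^{-1} E[b b' | gamma = j])
  sum(sapply(seq_along(alpha), function(j)
    alpha[j] * sum(diag(solve(U, mu2[[j]]))))) / r
}
# direct maximization depends only on the data, so it can move away from zero
direct_update = function(neg_loglik)
  optimize(neg_loglik, interval = c(0, 100))$minimum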

Illustration

Here I use BMR, not MASH regression, to illustrate. To reproduce, you need R 3.6.1+.

In [1]:
devtools::load_all('~/GIT/software/mvsusieR')
set.seed(1)
dat = mvsusie_sim1(n=50,p=50,r=2,s=2)
L = 10
Loading mvsusieR

Loading required package: mashr

Loading required package: ashr

There are 2 conditions, each having 2 effect variables.

In [2]:
sum(rowSums(dat$b)>0)
4

Fit without estimating prior

We use the fixed prior provided by the simulated data object; everything looks fine:

In [3]:
dat$V
A matrix: 2 × 2 of type dbl
0.46673960 0.05659291
0.05659291 0.50840182
In [4]:
fit0 = mvsusie(dat$X,dat$y,L=L,prior_variance=dat$V, 
              compute_objective=T, estimate_residual_variance=F, 
              estimate_prior_variance=F)
In [5]:
fit0$elbo
  1. -198.044780051201
  2. -196.062858410187
  3. -195.768521254949
  4. -195.608472451142
  5. -195.361025036579
  6. -195.242309228748
  7. -195.227677853132
  8. -195.225650183385
  9. -195.224819801876
In [6]:
fit0$V
  1. 1
  2. 1
  3. 1
  4. 1
  5. 1
  6. 1
  7. 1
  8. 1
  9. 1
  10. 1
In [7]:
susieR::susie_plot(fit0, 'PIP', b = rowSums(dat$b))

simple method

Using the simple method, only 1 out of the 4 effects was captured.

In [8]:
fit1 = mvsusie(dat$X,dat$y,L=L,prior_variance=dat$V, 
              compute_objective=T, estimate_residual_variance=F, 
              estimate_prior_variance=T, estimate_prior_method='simple',
              track_fit=TRUE)
In [9]:
fit1$elbo
  1. -182.785460711142
  2. -182.785460711142
In [10]:
fit1$V
  1. 1
  2. 0
  3. 0
  4. 0
  5. 0
  6. 0
  7. 0
  8. 0
  9. 0
  10. 0

Looking at the prior estimates in each iteration:

In [11]:
fit1$prior_history
    1. 1
    2. 0
    3. 0
    4. 0
    5. 0
    6. 0
    7. 0
    8. 0
    9. 0
    10. 0
    1. 1
    2. 0
    3. 0
    4. 0
    5. 0
    6. 0
    7. 0
    8. 0
    9. 0
    10. 0

EM method

Now we can see a decreasing ELBO, from -183 to -185, whereas previously the ELBO was -182 at convergence:

In [12]:
fit2 = mvsusie(dat$X,dat$y,L=L,prior_variance=dat$V, 
              compute_objective=T, estimate_residual_variance=F, 
              estimate_prior_variance=T, estimate_prior_method = 'EM',
              track_fit=TRUE)
Warning message in m$get_objective(dump = TRUE):
“Objective is not non-decreasing”
In [13]:
fit2$elbo
  1. -198.044780051201
  2. -183.656245164604
  3. -185.064348616454
In [14]:
fit2$V
  1. 0
  2. 0
  3. 0
  4. 0
  5. 0
  6. 0
  7. 0
  8. 0
  9. 0
  10. 0

And no effect was captured! In the first iteration the first two prior scalars are estimated as non-zero, but in the 2nd iteration they both become zero, because zero maximizes the log-likelihood in that iteration. Apparently, though, setting the prior to zero decreased the ELBO in the next iteration:

In [16]:
fit2$prior_history
    1. 0.772252893753661
    2. 0.437599977548084
    3. 0
    4. 0
    5. 0
    6. 0
    7. 0
    8. 0
    9. 0
    10. 0
    1. 0
    2. 0
    3. 0
    4. 0
    5. 0
    6. 0
    7. 0
    8. 0
    9. 0
    10. 0
    1. 0
    2. 0
    3. 0
    4. 0
    5. 0
    6. 0
    7. 0
    8. 0
    9. 0
    10. 0

However, when I set L = 4 it captures 1 effect:

In [17]:
fit21 = mvsusie(dat$X,dat$y,L=4,prior_variance=dat$V, 
              compute_objective=T, estimate_residual_variance=F, 
              estimate_prior_variance=T, estimate_prior_method = 'EM',
              track_fit=T)
Warning message in m$get_objective(dump = TRUE):
“Objective is not non-decreasing”
In [18]:
fit21$elbo
  1. -186.2057629323
  2. -182.505278071501
  3. -183.003946868202
In [19]:
fit21$V
  1. 0.651498240163232
  2. 0
  3. 0
  4. 0
In [20]:
fit21$prior_history
    1. 0.772252893753661
    2. 0.437599977548084
    3. 0
    4. 0
    1. 0.553146246485604
    2. 0
    3. 0
    4. 0
    1. 0.651498240163232
    2. 0
    3. 0
    4. 0
In [21]:
susieR::susie_plot(fit21, 'PIP', b = rowSums(dat$b))
In [22]:
sum(fit21$pip)
1

optim method

In the optim method we still have the check at the end of each update that compares the current estimate with zero in terms of likelihood, so many of the V are set to zero. However, two effects were captured this time:

In [23]:
fit3 = mvsusie(dat$X,dat$y,L=L,prior_variance=dat$V, 
              compute_objective=T, estimate_residual_variance=F, 
              estimate_prior_variance=T, estimate_prior_method='optim',
              track_fit=T)
In [24]:
fit3$elbo
  1. -182.651444503913
  2. -182.400440264486
  3. -182.281808669658
  4. -182.243681010512
  5. -182.126019092252
  6. -181.788240766637
  7. -181.653543757783
  8. -181.642803840152
  9. -181.641803585595
  10. -181.641687202537
In [25]:
fit3$V
  1. 0.715781056508291
  2. 0.697643195745796
  3. 0
  4. 0
  5. 0
  6. 0
  7. 0
  8. 0
  9. 0
  10. 0

Unlike with the EM update, the estimates for the first two effects never hit zero:

In [26]:
fit3$prior_history
    1. 0.718104560164367
    2. 0.224801351426067
    3. 0
    4. 0
    5. 0
    6. 0
    7. 0
    8. 0
    9. 0
    10. 0
    1. 0.670096870076645
    2. 0.389241325606643
    3. 0
    4. 0
    5. 0
    6. 0
    7. 0
    8. 0
    9. 0
    10. 0
    1. 0.698217111519865
    2. 0.464535584071289
    3. 0
    4. 0
    5. 0
    6. 0
    7. 0
    8. 0
    9. 0
    10. 0
    1. 0.707982718058914
    2. 0.480474396331732
    3. 0
    4. 0
    5. 0
    6. 0
    7. 0
    8. 0
    9. 0
    10. 0
    1. 0.708815689491669
    2. 0.516421119893738
    3. 0
    4. 0
    5. 0
    6. 0
    7. 0
    8. 0
    9. 0
    10. 0
    1. 0.720830675177921
    2. 0.629771919574128
    3. 0
    4. 0
    5. 0
    6. 0
    7. 0
    8. 0
    9. 0
    10. 0
    1. 0.723375689728803
    2. 0.684168873257886
    3. 0
    4. 0
    5. 0
    6. 0
    7. 0
    8. 0
    9. 0
    10. 0
    1. 0.718308377615937
    2. 0.694134021478436
    3. 0
    4. 0
    5. 0
    6. 0
    7. 0
    8. 0
    9. 0
    10. 0
    1. 0.716446884365248
    2. 0.696830471900811
    3. 0
    4. 0
    5. 0
    6. 0
    7. 0
    8. 0
    9. 0
    10. 0
    1. 0.715781056508291
    2. 0.697643195745796
    3. 0
    4. 0
    5. 0
    6. 0
    7. 0
    8. 0
    9. 0
    10. 0
In [27]:
susieR::susie_plot(fit3, 'PIP', b = rowSums(dat$b))

EM update without the zero check

I commented out this line in my code, and ran EM with L = 10:

In [28]:
devtools::load_all('~/GIT/software/mvsusieR')
Loading mvsusieR

In [33]:
fit4 = mvsusie(dat$X,dat$y,L=L,prior_variance=dat$V, 
              compute_objective=T, estimate_residual_variance=F, 
              estimate_prior_variance=T, estimate_prior_method='EM', max_iter = 200,
              track_fit=T)
In [34]:
fit4$elbo
  1. -198.044780051201
  2. -185.808446867106
  3. -184.186223744929
  4. -183.748348515929
  5. -183.517946916971
  6. -183.347957226874
  7. -183.228141911754
  8. -183.158819669811
  9. -183.122288636005
  10. -183.100822326533
  11. -183.08558841485
  12. -183.073211704825
  13. -183.062327230459
  14. -183.052198977251
  15. -183.042274528671
  16. -183.03201305774
  17. -183.020763776462
  18. -183.007608403357
  19. -182.991088736146
  20. -182.968679236777
  21. -182.935727526705
  22. -182.88344571863
  23. -182.796386513956
  24. -182.656681568024
  25. -182.475109898781
  26. -182.31519771353
  27. -182.218387872279
  28. -182.166127784205
  29. -182.132652406407
  30. -182.106278000122
  31. -182.083266460384
  32. -182.062470319903
  33. -182.043454610881
  34. -182.025980421308
  35. -182.009872422002
  36. -181.994984702269
  37. -181.981190998938
  38. -181.96838088548
  39. -181.956457457973
  40. -181.945335496277
  41. -181.934939885653
  42. -181.925204251526
  43. -181.916069785292
  44. -181.907484239335
  45. -181.899401068683
  46. -181.891778697851
  47. -181.88457989376
  48. -181.877771228245
  49. -181.871322616192
  50. -181.865206917615
  51. -181.859399593881
  52. -181.853878409897
  53. -181.84862317543
  54. -181.843615519816
  55. -181.838838695242
  56. -181.83427740455
  57. -181.829917650122
  58. -181.825746600967
  59. -181.821752475532
  60. -181.817924438144
  61. -181.814252507307
  62. -181.810727474311
  63. -181.807340830853
  64. -181.804084704534
  65. -181.800951801277
  66. -181.797935353803
  67. -181.795029075472
  68. -181.792227118822
  69. -181.789524038298
  70. -181.786914756664
  71. -181.784394534698
  72. -181.781958943801
  73. -181.7796038412
  74. -181.777325347474
  75. -181.775119826142
  76. -181.772983865106
  77. -181.770914259761
  78. -181.768907997584
  79. -181.766962244078
  80. -181.765074329917
  81. -181.763241739179
  82. -181.761462098566
  83. -181.759733167516
  84. -181.758052829118
  85. -181.756419081757
  86. -181.754830031426
  87. -181.753283884637
  88. -181.75177894189
  89. -181.750313591628
  90. -181.748886304666
  91. -181.747495629019
  92. -181.746140185128
  93. -181.744818661421
  94. -181.743529810206
  95. -181.742272443847
  96. -181.741045431215
  97. -181.739847694388
  98. -181.738678205572
  99. -181.737535984239
  100. -181.736420094456
  101. -181.735329642387
  102. -181.734263773969
  103. -181.733221672733
  104. -181.73220255777
  105. -181.731205681822
In [35]:
fit4$niter
105

It took 105 iterations to converge. The final ELBO of about -181.7 is comparable to that obtained by optim.

The prior estimates are:

In [36]:
fit4$V
  1. 0.714071252664812
  2. 0.688578701959507
  3. 0.00337649436178791
  4. 0.00331738817397583
  5. 0.00329732911004534
  6. 0.0032842114304321
  7. 0.00327467368818631
  8. 0.0032673322654283
  9. 0.00326146768668188
  10. 0.00325665554106171
In [37]:
fit4$prior_history
    1. 0.772252893753661
    2. 0.437599977548084
    3. 0.230816611326831
    4. 0.182712570727553
    5. 0.161966528494426
    6. 0.14890498820585
    7. 0.13983830974034
    8. 0.133145589210748
    9. 0.127990831154109
    10. 0.123895510397412
    1. 0.348780576745262
    2. 0.204830733993712
    3. 0.13264147204938
    4. 0.113990158776202
    5. 0.105131394536457
    6. 0.0988132823002502
    7. 0.0939315953550189
    8. 0.08998920889978
    9. 0.0867123175151383
    10. 0.0839316058941495
    1. 0.318314281408707
    2. 0.139151332826073
    3. 0.0989184485105305
    4. 0.088025622062182
    5. 0.0827403500789719
    6. 0.0788326183833658
    7. 0.0757231565949973
    8. 0.0731429957410007
    9. 0.0709404952208998
    10. 0.0690210406399394
    1. 0.352166997013485
    2. 0.110260897866691
    3. 0.0817942818277802
    4. 0.0741443384012128
    5. 0.0704777140198969
    6. 0.0677079834574479
    7. 0.0654576979152978
    8. 0.063556714774443
    9. 0.0619095716883881
    10. 0.0604560841632362
    1. 0.413484033088573
    2. 0.0947074401725199
    3. 0.0712504928043032
    4. 0.0651749881003786
    5. 0.0623503448986931
    6. 0.0602177374226174
    7. 0.0584767368155838
    8. 0.0569951764452616
    9. 0.0557005765104261
    10. 0.0545481710432499
    1. 0.492818989767097
    2. 0.085688759457636
    3. 0.0642963273842031
    4. 0.0590146733071478
    5. 0.0566290509296431
    6. 0.0548435910246225
    7. 0.053392638402784
    8. 0.0521602967311602
    9. 0.0510835311395561
    10. 0.0501237625868888
    1. 0.571321213803357
    2. 0.0803050756915857
    3. 0.059490918443972
    4. 0.0546285486319528
    5. 0.052487920612162
    6. 0.0509008770130016
    7. 0.0496192280051213
    8. 0.0485354349202314
    9. 0.047591320046614
    10. 0.0467514369537998
    1. 0.629836978546416
    2. 0.0770340100656435
    3. 0.0559640252179707
    4. 0.0513364261338768
    5. 0.049349151218669
    6. 0.0478897635245875
    7. 0.0467187789621496
    8. 0.0457331928928537
    9. 0.0448776189723954
    10. 0.0441184836044137
    1. 0.66408769558506
    2. 0.0751062196275566
    3. 0.0532184552515274
    4. 0.0487242877244731
    5. 0.0468409761100616
    6. 0.0454714997457457
    7. 0.0443801196212489
    8. 0.0434661517274231
    9. 0.0426757817608774
    10. 0.0419765596920409
    1. 0.680751264049956
    2. 0.0741380079465696
    3. 0.0509761504508198
    4. 0.0465545846411982
    5. 0.0447451949264225
    6. 0.0434425389560861
    7. 0.0424117258009453
    8. 0.0415530911513403
    9. 0.0408136626861662
    10. 0.0401616533673864
    1. 0.687506585922961
    2. 0.0739350861197195
    3. 0.0490726500657946
    4. 0.0446863522978146
    5. 0.0429319414109733
    6. 0.0416812472746624
    7. 0.0406985541983706
    8. 0.0398844748074473
    9. 0.039186461748461
    10. 0.0385731256230509
    1. 0.689374044876907
    2. 0.0744110778521846
    3. 0.0474042475014478
    4. 0.0430309268117504
    5. 0.0413196317317423
    6. 0.0401112999847064
    7. 0.0391684891320084
    8. 0.0383916924597413
    9. 0.0377285642976069
    10. 0.0371479694297903
    1. 0.689032397169939
    2. 0.0755527407800426
    3. 0.0459013709634381
    4. 0.0415291847382497
    5. 0.039853927710144
    6. 0.0386819115559522
    7. 0.037773600340774
    8. 0.0370292052353413
    9. 0.036396486424101
    10. 0.0358444981454197
    1. 0.687797641306849
    2. 0.0774075631425995
    3. 0.0445142180630376
    4. 0.0401392983217729
    5. 0.0384964529125933
    6. 0.0373572395700764
    7. 0.0364800776659467
    8. 0.0357649086255916
    9. 0.0351595956502203
    10. 0.0346333738158734
    1. 0.686301928438894
    2. 0.0800866606457768
    3. 0.0432045764154652
    4. 0.0388297799911545
    5. 0.037218399955907
    6. 0.0361104025577631
    7. 0.0352625697503015
    8. 0.0345747310276004
    9. 0.0339949247368479
    10. 0.0334926023357102
    1. 0.684858771726241
    2. 0.0837824656141522
    3. 0.0419407891565497
    4. 0.0375752245648316
    5. 0.035996622818784
    6. 0.0349198303552678
    7. 0.0341007390913323
    8. 0.0334393572851371
    9. 0.0328840370488476
    10. 0.0324045199929872
    1. 0.68364484775395
    2. 0.0888064626283074
    3. 0.0406942121427318
    4. 0.0363533578783541
    5. 0.0348109356685831
    6. 0.0337667457812159
    7. 0.0329768889266027
    8. 0.0323419758569724
    9. 0.0318108759708623
    10. 0.0313537313294974
    1. 0.682791771367817
    2. 0.0956606415727144
    3. 0.0394361778085536
    4. 0.0351425682021279
    5. 0.0336418620740864
    6. 0.0326330688490092
    7. 0.0318739890079384
    8. 0.0312664048110158
    9. 0.0307599761319298
    10. 0.0303253944049722
    1. 0.682439189329662
    2. 0.10517186698894
    3. 0.0381347880466546
    4. 0.0339193467644676
    5. 0.0324683022978105
    6. 0.0314992403389492
    7. 0.030773619486854
    8. 0.0301951338749748
    9. 0.0297145885415539
    10. 0.0293034180656758
    1. 0.682776831077307
    2. 0.118748689983147
    3. 0.0367511399294308
    4. 0.0326552372759695
    5. 0.0312647228677088
    6. 0.0303415735443374
    7. 0.0296534459968443
    8. 0.0291068994763304
    9. 0.0286543400266742
    10. 0.0282681947279377
    1. 0.684091353053259
    2. 0.138871938814153
    3. 0.0352356563582494
    4. 0.0313135374038589
    5. 0.029997984365979
    6. 0.029129170154857
    7. 0.0284842039499602
    8. 0.0279737255628424
    9. 0.0275523227978742
    10. 0.0271937333491561
    1. 0.686819622658178
    2. 0.169956413756255
    3. 0.0335296802576374
    4. 0.0298489815956523
    5. 0.0286264515356364
    6. 0.0278226676861181
    7. 0.0272281815773375
    8. 0.0267591992822108
    9. 0.0263732001635298
    10. 0.0260456217903998
    1. 0.691550638147311
    2. 0.219302671057312
    3. 0.0315905340592951
    4. 0.028223310515274
    5. 0.0271126837757984
    6. 0.0263850714424433
    7. 0.0258486754390097
    8. 0.025426829661028
    9. 0.025080650148953
    10. 0.0247876828053348
    1. 0.698700067781788
    2. 0.295322027501959
    3. 0.029464964261296
    4. 0.0264585438760271
    5. 0.0254706895313862
    6. 0.0248251201557767
    7. 0.0243504679982151
    8. 0.0239782111643958
    9. 0.0236735820981637
    10. 0.0234164955991146
    1. 0.707367182485501
    2. 0.396079328398168
    3. 0.0273378722630898
    4. 0.0246840408994679
    5. 0.0238130801578658
    6. 0.023244663825073
    7. 0.0228274699238083
    8. 0.0225009353420961
    9. 0.0222343042623042
    10. 0.022009795424786
    1. 0.714572811657588
    2. 0.497135080072082
    3. 0.0253916185321135
    4. 0.0230428659324804
    5. 0.0222721354038772
    6. 0.0217694559802882
    7. 0.0214008943500442
    8. 0.0211127971327255
    9. 0.0208778946799568
    10. 0.0206804096352994
    1. 0.717930371599802
    2. 0.571760111179281
    3. 0.0236745799063452
    4. 0.0215812497395732
    5. 0.0208944761703754
    6. 0.0204468047576322
    7. 0.0201188646886408
    8. 0.0198627997629438
    9. 0.0196542697316358
    10. 0.0194791833547233
    1. 0.717935022790402
    2. 0.615743551705277
    3. 0.0221599422389636
    4. 0.0202828882057191
    5. 0.01966730036047
    6. 0.0192662899648304
    7. 0.0189728090310319
    8. 0.0187439119866257
    9. 0.0185577422744155
    10. 0.0184016393578122
    1. 0.716378273616803
    2. 0.638761674020014
    3. 0.0208133079071682
    4. 0.0191220774873128
    5. 0.0185675963721936
    6. 0.0182066230141149
    7. 0.0179426936216939
    8. 0.0177370818456674
    9. 0.0175700659841651
    10. 0.0174302157367816
    1. 0.714624559675707
    2. 0.650361596177655
    3. 0.0196081739097604
    4. 0.0180780919797861
    5. 0.0175764815027897
    6. 0.0172501051393335
    7. 0.0170116786291695
    8. 0.0168261353256179
    9. 0.0166756057107856
    10. 0.0165497271086083
    1. 0.713265379414631
    2. 0.656317627616281
    3. 0.0185242877434431
    4. 0.0171347946574883
    5. 0.0166791951473046
    6. 0.0163828813852398
    7. 0.0161665797943953
    8. 0.0159984179163056
    9. 0.0158621432251369
    10. 0.0157483253561603
    1. 0.712395532496227
    2. 0.659587005446953
    3. 0.0175453205786428
    4. 0.0162790501332988
    5. 0.0158636938492546
    6. 0.0155936366700626
    7. 0.0153966285429969
    8. 0.0152435983290387
    9. 0.0151197120745711
    10. 0.0150163575895526
    1. 0.711914119493346
    2. 0.661593447623953
    3. 0.0166576815744199
    4. 0.0154998591273003
    5. 0.0151198656489364
    6. 0.0148728526802205
    7. 0.0146927535461347
    8. 0.0145529643016711
    9. 0.0144399008731037
    10. 0.0143456722260588
    1. 0.711689223264453
    2. 0.663007043067616
    3. 0.0158499051136934
    4. 0.0147878994545063
    5. 0.014439116978875
    6. 0.0142124223211076
    7. 0.0140472135582941
    8. 0.0139190682833483
    9. 0.0138155081119378
    10. 0.0137292802449123
    1. 0.711616411115853
    2. 0.664139900285655
    3. 0.0151122570851257
    4. 0.0141352250395618
    5. 0.0138141040254128
    6. 0.0136054024331161
    7. 0.0134533658735784
    8. 0.0133355081930423
    9. 0.0132403328909006
    10. 0.0131611542032889
    1. 0.711626965894996
    2. 0.665135756531882
    3. 0.0144364378764538
    4. 0.0135350352594547
    5. 0.0132385267211916
    6. 0.0130458246836371
    7. 0.012905490142265
    8. 0.0127967617265973
    9. 0.0127090180718384
    10. 0.0126360790507399
    1. 0.711679993605135
    2. 0.666059471270681
    3. 0.0138153434403091
    4. 0.0129814875548273
    5. 0.012706960641614
    6. 0.0125285408001579
    7. 0.0123986439020611
    8. 0.0122980503338365
    9. 0.0122169214570596
    10. 0.0121495296174453
    1. 0.711752704917154
    2. 0.666939224592037
    3. 0.0132428694690824
    4. 0.0124695429795959
    5. 0.0122147178141855
    6. 0.0120490939059815
    7. 0.0119285420377105
    8. 0.0118352253449686
    9. 0.0117600076084064
    10. 0.0116975674317375
    1. 0.711833025820957
    2. 0.667786442038861
    3. 0.0127137501738652
    4. 0.0119948386051693
    5. 0.0117577312819485
    6. 0.011603611420182
    7. 0.0114914560892641
    8. 0.0114046725232039
    9. 0.0113347568693
    10. 0.0112767536616012
    1. 0.711914856754983
    2. 0.66860517738568
    3. 0.0122234253552291
    4. 0.0115535820567503
    5. 0.0113324593818973
    6. 0.0111887159700374
    7. 0.0110841302392361
    8. 0.0110032321890892
    9. 0.0109380890042288
    10. 0.0108840758293036
    1. 0.711995288884021
    2. 0.669396480346404
    3. 0.0117679306380133
    4. 0.0111424642414749
    5. 0.0109358063168717
    6. 0.010801451269838
    7. 0.010703711231275
    8. 0.0106281324196915
    9. 0.0105672991760556
    10. 0.010516886212389
    1. 0.712073056963204
    2. 0.670160376449399
    3. 0.0113438066423765
    4. 0.0107585869786406
    5. 0.0105650561303184
    6. 0.0104392203634453
    7. 0.010347689846972
    8. 0.0102769331418631
    9. 0.0102200042479293
    10. 0.010172850067147
    1. 0.712147711977328
    2. 0.670896717436744
    3. 0.0109480236204256
    4. 0.0103994028169853
    5. 0.0102178176785236
    6. 0.0100997340466022
    7. 0.0100138519321496
    8. 0.00994747925781517
    9. 0.00989409767559455
    10. 0.00984990205395828
    1. 0.712219192042848
    2. 0.671605506832625
    3. 0.010577918727514
    4. 0.0100626648200427
    5. 0.00989197862291806
    6. 0.00978096766249653
    7. 0.00970023730058625
    8. 0.00963786124305582
    9. 0.00958771152718067
    10. 0.00954620948780029
    1. 0.712287606085536
    2. 0.672286990278562
    3. 0.0102311436299259
    4. 0.00974638451694263
    5. 0.00958566682912895
    6. 0.00948112478981068
    7. 0.00940510513797334
    8. 0.00934638192539751
    9. 0.00929918441324262
    10. 0.00926014126229229
    1. 0.712353128162681
    2. 0.672941647694482
    3. 0.00990562058766826
    4. 0.0094487965584478
    5. 0.00929721786151757
    6. 0.00919860661531614
    7. 0.00912690477836169
    8. 0.00907152838348486
    9. 0.00902703432234929
    10. 0.00899024149276661
    1. 0.712415947973042
    2. 0.673570150551752
    3. 0.0095995055058812
    4. 0.0091683288960574
    5. 0.00902514750867101
    6. 0.00893198600741166
    7. 0.0088642509319435
    8. 0.00881194809499149
    9. 0.00876993553625954
    10. 0.00873520709104316
    1. 0.712476249216636
    2. 0.674173312155431
    3. 0.00931115673451841
    4. 0.00890357752607895
    5. 0.00876812847502858
    6. 0.0086799854894934
    7. 0.00861590261195244
    8. 0.00856642862226741
    9. 0.00852669894656699
    10. 0.00849386862371724
    1. 0.712534201358535
    2. 0.674752042151695
    3. 0.00903910862591347
    4. 0.00865328502050318
    5. 0.00852497053473088
    6. 0.00844145845963973
    7. 0.00838074514565903
    8. 0.00833388025178089
    9. 0.00829625521581269
    10. 0.0082651739202903
    1. 0.712589957577481
    2. 0.675307308859888
    3. 0.00878204904443863
    4. 0.00841632221085129
    5. 0.00829460357320914
    6. 0.00821537312218559
    7. 0.00815777476550327
    8. 0.00811332110813648
    9. 0.00807764032480634
    10. 0.00804817399118275
    1. 0.712643655370983
    2. 0.675840109748111
    3. 0.00853880017069343
    4. 0.0081916725070354
    5. 0.00807606304618079
    6. 0.00800079869284299
    7. 0.00794608536626094
    8. 0.00790386434818482
    9. 0.00786998312824741
    10. 0.00784201089215688
    1. 0.712695418142054
    2. 0.676351449117225
    3. 0.00830830206187882
    4. 0.00797841842647187
    5. 0.00786847746966319
    6. 0.00779689351664647
    7. 0.00774485708690443
    8. 0.00770470710952634
    9. 0.00767249460612939
    10. 0.00764590723408721
    1. 0.712745357007076
    2. 0.676842321714503
    3. 0.00808959852611763
    4. 0.00777572998385268
    5. 0.00767105762245059
    6. 0.00760289480085414
    7. 0.00755334643486132
    8. 0.0075151209436461
    9. 0.00748445855170148
    10. 0.00745915708799021
    1. 0.712793572505735
    2. 0.677313701034564
    3. 0.00788182494617088
    4. 0.00758285465279205
    5. 0.00748308719747897
    6. 0.00741810971597293
    7. 0.00737087771840954
    8. 0.00733444350951111
    9. 0.00730522348028534
    10. 0.00728111807693821
    1. 0.712840156100687
    2. 0.677766531236874
    3. 0.0076841977509955
    4. 0.00739910865993292
    5. 0.00730391468321233
    6. 0.00724190765965205
    7. 0.0071968355921464
    8. 0.00716207134073249
    9. 0.00713419557888455
    10. 0.00711120448069987
    1. 0.712885191446939
    2. 0.678201721808466
    3. 0.00749600528483764
    4. 0.00722386941231577
    5. 0.00713294629266847
    6. 0.00707371351217183
    7. 0.0070306585525586
    8. 0.00699745352995533
    9. 0.00697083254579399
    10. 0.00694888120709544
    1. 0.712928755448499
    2. 0.678620144284648
    3. 0.00731659986538681
    4. 0.00705656889169445
    5. 0.00696963978757893
    6. 0.00691300174012553
    7. 0.00687183324707895
    8. 0.00684008619927788
    9. 0.00681463819352777
    10. 0.00679365850727458
    1. 0.712970919132202
    2. 0.67902263049542
    3. 0.00714539085677592
    4. 0.00689668787646355
    5. 0.00681349906973068
    6. 0.00675929122782958
    7. 0.0067198894817383
    8. 0.00668950764624724
    9. 0.00666515770830912
    10. 0.00664508733133706
    1. 0.713011748369916
    2. 0.679409971929999
    3. 0.00698183861136498
    4. 0.00674375087407711
    5. 0.00666406943178406
    6. 0.00661214073493921
    7. 0.00657439583048293
    8. 0.00654529407215467
    9. 0.00652197347588188
    10. 0.00650275523666251
    1. 0.713051304477285
    2. 0.679782919909545
    3. 0.00682544915745865
    4. 0.00659732166519368
    5. 0.00652093337661592
    6. 0.00647114489444091
    7. 0.00643495576412915
    8. 0.00640705581361936
    9. 0.00638470139713729
    10. 0.00636628277458983
    1. 0.713089644712774
    2. 0.680142186333799
    3. 0.00667576952930817
    4. 0.00645699937600051
    5. 0.00638370692814663
    6. 0.00633593067823955
    7. 0.00630120422932838
    8. 0.00627443401033564
    9. 0.00625298762850473
    10. 0.0062353202921704
    1. 0.713126822696488
    2. 0.680488444825222
    3. 0.00653238365168316
    4. 0.00632241500782677
    5. 0.00625203636819598
    6. 0.00620615426843891
    7. 0.00617280461826989
    8. 0.00614709765179248
    9. 0.0061265056916392
    10. 0.00610954509500083
    1. 0.713162888764375
    2. 0.680822332138454
    3. 0.0063949087045632
    4. 0.0061932283637185
    5. 0.00612559534359472
    6. 0.00608149828151615
    7. 0.00604944607852007
    8. 0.00602474095410209
    9. 0.00600495390497889
    10. 0.00598865892493645
    1. 0.713197890270257
    2. 0.68114444973652
    3. 0.0062629919045769
    4. 0.00606912532048912
    5. 0.00600408229589326
    6. 0.00596166930022919
    7. 0.00593084111967993
    8. 0.00590708102507541
    9. 0.00588805309651398
    10. 0.0058723857130522
    1. 0.713231871845553
    2. 0.681455365460751
    3. 0.00613630764909406
    4. 0.00594981540218297
    5. 0.00588721817283177
    6. 0.00584639567452429
    7. 0.00581672347967912
    8. 0.00579385578158665
    9. 0.00577554456281794
    10. 0.00576046957376245
    1. 0.713264875624467
    2. 0.681755615240785
    3. 0.00601455497667069
    4. 0.00583502961714077
    5. 0.00577474438648828
    6. 0.00573542555813685
    7. 0.00570684621870828
    8. 0.00568482208825996
    9. 0.00566718824422516
    10. 0.00565267301070759
    1. 0.713296941440852
    2. 0.682045704805658
    3. 0.00589745530411312
    4. 0.00572451852613563
    5. 0.005666420987886
    6. 0.00562852515217155
    7. 0.00560098001318318
    8. 0.0055797540907445
    9. 0.00556276109013914
    10. 0.00554877530900228
    1. 0.713328107001626
    2. 0.682326111368124
    3. 0.00578475040597488
    4. 0.00561805051351947
    5. 0.00556202503196523
    6. 0.00552547713084562
    7. 0.00549891162586493
    8. 0.00547844172044156
    9. 0.00546205559194375
    10. 0.00544857109183482
    1. 0.713358408040671
    2. 0.682597285262702
    3. 0.00567620060699944
    4. 0.0055154102371171
    5. 0.00546134911033259
    6. 0.00542607922789665
    7. 0.0054004425314376
    8. 0.00538068935061443
    9. 0.00536487846396522
    10. 0.00535186902230452
    1. 0.713387878456331
    2. 0.682859651524274
    3. 0.00557158316201381
    4. 0.0054163972358381
    5. 0.00536420003218902
    6. 0.00533014296498636
    7. 0.00530538767955769
    8. 0.0052863145864319
    9. 0.00527104945547634
    10. 0.00525849063386192
    1. 0.713416550435032
    2. 0.68311361139875
    3. 0.00547069080117712
    4. 0.00532082467673686
    5. 0.00527039763639318
    6. 0.00523749250585309
    7. 0.00521357437971246
    8. 0.00519514717374084
    9. 0.00518040027891484
    10. 0.00516826927484292
    1. 0.713444454563037
    2. 0.683359543780896
    3. 0.00537333042139084
    4. 0.00522851822561315
    5. 0.00517977371980533
    6. 0.00514796362204129
    7. 0.00512484129421642
    8. 0.00510702801329072
    9. 0.00509277364136098
    10. 0.00508104915441486
    1. 0.713471619928032
    2. 0.683597806576945
    3. 0.00527932190716069
    4. 0.0051393150272717
    5. 0.00509217106893641
    6. 0.00506140275782052
    7. 0.00503903752738973
    8. 0.00502180826879121
    9. 0.0050080223679325
    10. 0.00499668447882682
    1. 0.71349807421185
    2. 0.683828737991577
    3. 0.00518849706633156
    4. 0.00505306278330196
    5. 0.00500744258354503
    6. 0.00497766618344373
    7. 0.00495602180043928
    8. 0.00493934855861492
    9. 0.00492600860714717
    10. 0.00491503866821589
    1. 0.71352384377548
    2. 0.684052657740204
    3. 0.00510069866794891
    4. 0.00496961891674074
    5. 0.00492545048222096
    6. 0.00489661922722137
    7. 0.00487566170284012
    8. 0.00485951822219403
    9. 0.00484660310950725
    10. 0.0048359836453975
    1. 0.713548953737272
    2. 0.684269868188467
    3. 0.00501577957107926
    4. 0.00488884981427806
    5. 0.00484606558120071
    6. 0.00481813557803642
    7. 0.00479783301212005
    8. 0.00478219465323097
    9. 0.00476968457160391
    10. 0.00475939918908777
    1. 0.713573428045124
    2. 0.684480655421503
    3. 0.00493360193478666
    4. 0.00481063013778903
    5. 0.00476916663870667
    6. 0.00474209665092129
    7. 0.00472241907490919
    8. 0.00470726269277355
    9. 0.00469513903894585
    10. 0.00468517234489234
    1. 0.713597289543309
    2. 0.684685290245996
    3. 0.00485403650064237
    4. 0.00473484219794949
    5. 0.00469463975800953
    6. 0.00466839100918262
    7. 0.00464931024295028
    8. 0.00463461407601409
    9. 0.00462285936150577
    10. 0.00461319688816802
    1. 0.713620560034464
    2. 0.684884029128273
    3. 0.00477696194016976
    4. 0.00466137538353966
    5. 0.00462237784320386
    6. 0.00459691383731378
    7. 0.00457840335849194
    8. 0.00456414692737734
    9. 0.00455274469666574
    10. 0.00454337283353656
    1. 0.713643260337253
    2. 0.685077115071847
    3. 0.00470226426051599
    4. 0.00459012564077791
    5. 0.00455228010237581
    6. 0.00452756645959192
    7. 0.00450960128412121
    8. 0.00449576529907815
    9. 0.00448470005484401
    10. 0.00447560598641862
    1. 0.713665410340107
    2. 0.685264778437823
    3. 0.00462983626241806
    4. 0.00452099499767089
    5. 0.00448425159344363
    6. 0.00446025589983121
    7. 0.00444281247264634
    8. 0.00442937874886851
    9. 0.00441863588361178
    10. 0.00440980753247118
    1. 0.713687029051357
    2. 0.685447237711599
    3. 0.00455957704520691
    4. 0.00445389112892899
    5. 0.0044182028084785
    6. 0.00439489447826693
    7. 0.00437795057312668
    8. 0.0043649019531663
    9. 0.00435446768656982
    10. 0.00434589366126355
    1. 0.713708134646136
    2. 0.685624700219167
    3. 0.00449139155418513
    4. 0.00438872695748968
    5. 0.00435404929277501
    6. 0.00433139944198655
    7. 0.00431493406957313
    8. 0.00430225435217271
    9. 0.00429211567366015
    10. 0.00428378522092485
    1. 0.713728744510254
    2. 0.685797362796232
    3. 0.00442519016623165
    4. 0.00432542028912393
    5. 0.00429171129534619
    6. 0.00426969262571216
    7. 0.00425368594921759
    8. 0.00424135982395045
    9. 0.00423150443994484
    10. 0.00422340740084492
    1. 0.713748875281342
    2. 0.685965412413237
    3. 0.0043608883099417
    4. 0.00426389347698187
    5. 0.00423111344787509
    6. 0.00420970014007972
    7. 0.00419413339758033
    8. 0.0041821463847564
    9. 0.00417256267019808
    10. 0.0041646894398195
    1. 0.713768542887444
    2. 0.686129026759202
    3. 0.00429840611700982
    4. 0.00420407311326865
    5. 0.00417218446946931
    6. 0.00415135208486246
    7. 0.00413620751785557
    8. 0.00412454591320604
    9. 0.0041152228669357
    10. 0.00410756435730267
    1. 0.713787762583269
    2. 0.686288374787175
    3. 0.00423766810191552
    4. 0.00414588974553771
    5. 0.00411485689484276
    6. 0.00409458228485126
    7. 0.0040798430723932
    8. 0.00406849389609822
    9. 0.00405942109975154
    10. 0.00405196870567056
    1. 0.713806548984273
    2. 0.686443617223878
    3. 0.00417860286728166
    4. 0.00408927761534939
    5. 0.00405906682379453
    6. 0.00403932804634069
    7. 0.00402497824428264
    8. 0.00401392919395087
    9. 0.00400509677404758
    10. 0.00399784234161361
    1. 0.713824916098715
    2. 0.686594907046034
    3. 0.00412114283254975
    4. 0.00403417441727474
    5. 0.00400475369007232
    6. 0.00398552993237798
    7. 0.00397155441724681
    8. 0.00396079382449524
    9. 0.00395219241743783
    10. 0.00394512821496455
    1. 0.71384287735785
    2. 0.686742389925642
    3. 0.00406522398385886
    4. 0.00398052107642878
    5. 0.00395186004790057
    6. 0.00393313155511702
    7. 0.00391951597223384
    8. 0.00390903276255154
    9. 0.00390065348227723
    10. 0.00389377217343718
    1. 0.713860445644366
    2. 0.686886204646376
    3. 0.00401078564323001
    4. 0.00392826154289988
    5. 0.00390033137462532
    6. 0.0038820793837846
    7. 0.00386881009925363
    8. 0.0038585937548643
    9. 0.00385042816291935
    10. 0.00384372278190101
    1. 0.713877633319199
    2. 0.687026483493095
    3. 0.00395777025534855
    4. 0.00387734260160363
    5. 0.00385011588807991
    6. 0.00383232256691211
    7. 0.00381938662314846
    8. 0.0038094271486145
    9. 0.00380146722644258
    10. 0.00379493115495054
    1. 0.713894452246835
    2. 0.687163352616339
    3. 0.00390612319040663
    4. 0.00382771369623352
    5. 0.00380116437741181
    6. 0.00378381276761676
    7. 0.00377119784211362
    8. 0.00376148573244933
    9. 0.00375372385570583
    10. 0.00374735080164696
    1. 0.713910913819178
    2. 0.687296932373549
    3. 0.00385579256161904
    4. 0.00377932676610946
    5. 0.00375343004623186
    6. 0.00373650401083298
    7. 0.00372419837789717
    8. 0.00371472458898081
    9. 0.00370715350370288
    10. 0.00370093748141653
    1. 0.71392702897811
    2. 0.687427337648625
    3. 0.00380672905615993
    4. 0.00373213609483951
    5. 0.00370686836705577
    6. 0.00369035254149859
    7. 0.00367834503670904
    8. 0.00366910095780333
    9. 0.0036617137582817
    10. 0.00365564907018535
    1. 0.713942808236788
    2. 0.687554678151321
    3. 0.00375888577838841
    4. 0.00368609816981323
    5. 0.00366143694610448
    6. 0.00364531669279392
    7. 0.00363359667996043
    8. 0.00362457410816891
    9. 0.0036173642163817
    10. 0.00361144543591567
    1. 0.713958261699791
    2. 0.687679058697873
    3. 0.00371221810433813
    4. 0.00364117155163635
    5. 0.00361709539761673
    6. 0.00360135676361536
    7. 0.00358991410403553
    8. 0.00358110522053792
    9. 0.00357406636701986
    10. 0.00356828832278531
    1. 0.713973399082162
    2. 0.687800579474154
    3. 0.00366668354654207
    4. 0.00359731675269912
    5. 0.0035738052269053
    6. 0.00355843490453991
    7. 0.00354725992837055
    8. 0.00353865727629482
    9. 0.00353178348232649
    10. 0.00352614124332085
    1. 0.713988229727444
    2. 0.687919336282553
    3. 0.0036222416283501
    4. 0.00355449612414444
    5. 0.00353152972145798
    6. 0.00351651501160481
    7. 0.00350559849118054
    8. 0.00349719495498198
    9. 0.00349048051599461
    10. 0.00348496937785693
    1. 0.714002762624733
    2. 0.688035420773693
    3. 0.00357885376697389
    4. 0.00351267375056823
    5. 0.00349023384944755
    6. 0.00347556262728662
    7. 0.00346489575223351
    8. 0.00345668453846281
    9. 0.00345012400856302
    10. 0.00344473948075003
    1. 0.714017006424841
    2. 0.688148920664015
    3. 0.00353648316456321
    4. 0.00347181535184429
    5. 0.00344988416507125
    6. 0.003435544848119
    7. 0.0034251192021243
    8. 0.0034170938214771
    9. 0.00341068199900472
    10. 0.00340541979282495
    1. 0.714030969455596
    2. 0.688259919940198
    3. 0.00349509470668038
    4. 0.00343188819151958
    5. 0.00341044872019173
    6. 0.00339643023843759
    7. 0.00338623777754879
    8. 0.00337839202809837
    9. 0.00337212394213802
    10. 0.00336697995957812
    1. 0.714044659736358
    2. 0.688368499051286
    3. 0.0034546548675954
    4. 0.00339286099127453
    5. 0.00337189698179692
    6. 0.0033581887497846
    7. 0.00334822178212208
    8. 0.00334054973364553
    9. 0.00333442063141948
    10. 0.00332939095470217
    1. 0.714058084991773
    2. 0.68847473508936
    3. 0.00341513162187545
    4. 0.00335470385098673
    5. 0.00333419975483845
    6. 0.00332079164554625
    7. 0.00331104281232344
    8. 0.00330353879163926
    9. 0.00329754412671528
    10. 0.00329262500853404
    1. 0.714071252664812
    2. 0.688578701959507
    3. 0.00337649436178791
    4. 0.00331738817397583
    5. 0.00329732911004534
    6. 0.0032842114304321
    7. 0.00327467368818631
    8. 0.0032673322654283
    9. 0.00326146768668188
    10. 0.00325665554106171

Here it captures 2 effects, as also reflected below that two effects have large PIP,

In [58]:
susieR::susie_plot(fit4, 'PIP', b = rowSums(dat$b))

Final solution

Following Matthew's suggestion, we now perform the comparison with zero (and set all posterior quantities to zero accordingly) only after 10 iterations. This is the current default implementation of the EM method:
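In sketch form (same hypothetical names as in the earlier sketch), the update becomes:

# sketch only; hypothetical names, not the actual mvsusieR internals
em_with_delayed_zero_check = function(model, sigma0_sq, iter, check_after = 10) {
  s_new = em_step_sigma0(model, sigma0_sq)  # plain EM update
  # only compare with the null model after check_after iterations;
  # when zero wins, the posterior quantities for this effect are also zeroed
  if (iter > check_after &&
      single_effect_loglik(model, 0) > single_effect_loglik(model, s_new))
    s_new = 0
  s_new
}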

In [2]:
fit5 = mvsusie(dat$X,dat$y,L=L,prior_variance=dat$V, 
              compute_objective=T, estimate_residual_variance=F, 
              estimate_prior_variance=T, estimate_prior_method='EM',
              track_fit=T)

The convergence is now faster,

In [3]:
fit5$niter
34
In [4]:
fit5$elbo
  1. -198.044780051201
  2. -185.808446867106
  3. -184.186223744929
  4. -183.748348515929
  5. -183.517946916971
  6. -183.347957226874
  7. -183.228141911754
  8. -183.158819669811
  9. -183.122288636005
  10. -183.100822326533
  11. -183.044586444743
  12. -182.861451903339
  13. -182.818408667179
  14. -182.783486354127
  15. -182.756119965729
  16. -182.750229117781
  17. -182.735071648099
  18. -182.730465406676
  19. -182.725419263242
  20. -182.718932307953
  21. -182.710280433489
  22. -182.698368211485
  23. -182.681350548334
  24. -182.656000311233
  25. -182.61671773808
  26. -182.555042844271
  27. -182.463696168948
  28. -182.344595986929
  29. -182.173990294663
  30. -181.860343382417
  31. -181.680238281792
  32. -181.647334578022
  33. -181.642486688434
  34. -181.641790080555

This now achieves the same ELBO as the optim method.

In [5]:
fit5$V
  1. 0.716272294674902
  2. 0
  3. 0
  4. 0
  5. 0
  6. 0
  7. 0
  8. 0
  9. 0.694622119578786
  10. 0

Here, two effects are captured, with results very similar to optim; but they appear as single effects 1 and 9!

In [6]:
susieR::susie_plot(fit5, 'PIP', b = rowSums(dat$b))
