
[lme4 package] Does the order of rows(trials) in a data frame (long format) affect the results of the lmer model (maybe somehow)?

4 messages · Zhaohong, Ben Bolker, Emmanuel Curis +1 more

#
*I believe that the order shouldn't matter, but I don't know why I got
different results after fitting the same model in a data frame that is
ordered differently. *

*Here is my R code. Did I use the order() function incorrectly, such that I
got different results?*

*The lmer model fitted to the data frame before reordering (there were no
warning messages, and the model converged):*

*Model1<-lmer(Stimulus.RT~
1+A*B*C+(1+A|Subject)+(1+A*B*C|Item)+(1+A*B*C|Category),
data=edata,verbose=2,control=lmerControl(optCtrl=list(maxfun=500000)))*

*The verbose results are as follows:*
npt = 77 , n =  75
rhobeg =  0.2 , rhoend =  2e-07
   0.020:  86:     -783.903;0.492578 -0.0462110 -0.0363706 -0.0207690
-0.00431843 0.00131644 -0.00481740 -0.00308283 0.502089 -0.0211198
-0.0336018 -0.0237567 -0.00155808 -0.0307396 -0.000390386 0.354830
-0.0308058 -0.0351383 -0.0137964 -0.0238208 -0.0133714 0.305507 -0.0174871
-0.0328382 -0.00155512 0.00996183 0.646703 -0.0258383 -0.00140789
-0.00146090 0.593574 -0.0176632 -0.0351208 0.611448 0.000434827 0.819591
0.735976 -0.105154 0.553306 0.948775 -0.00950248 -0.00976528 -0.0113362
-0.00949456 -0.00980388 -0.0100146 -0.00779947 0.946928 -0.00984788
-0.0101504 -0.00957164 -0.00986887 -0.00987130 -0.00960832 0.946926
-0.0101303 -0.0103410 -0.00976277 -0.00978197 -0.00937871 0.947412
-0.00988796 -0.0100200 -0.0103097 -0.0103002 0.949607 -0.0106541
-0.00943222 -0.0111189 0.947582 -0.0104333 -0.00883065 0.949409 -0.0119380
0.954371
  0.0020: 1266:     -1011.94;0.450579 -0.0561922 -0.0390931 0.00888335
0.0594143 0.0967819 0.0600815 0.0674752 0.220653 0.00904686 -0.0695420
-0.0628733 0.117864 -0.0596009 0.0799141 0.229251 -0.0664593 -0.123684
0.0490019 -0.0235120 -0.0541726 0.118324 -0.0741990 -0.0809683 0.0958857
0.268214 0.249829 0.0324988 0.114487 0.128747 0.0173838 0.0167991
-0.00732449 0.0501891 0.0850320 0.157297 0.746053 -0.130440 0.331480
0.293937 0.0743482 0.0377308 -0.139008 0.0682751 0.0331050 -0.0866526
-0.0383130 0.0117850 -0.0240952 0.0400393 0.0809940 -0.0429156 -0.159851
0.0126200 0.0106298 -0.0212809 0.00295026 -0.00943432 0.0602099 -0.0644152
 0.00000 0.0531561 0.00504124 0.107394 -0.0116981 0.177808 -0.0275010
0.0632146 -0.0964148 0.0448545 0.0930555 -0.0329402 0.233815 0.111024
0.684505
 0.00020: 2829:     -1024.14;0.452709 -0.0579625 -0.0344969 0.0120118
0.0734510 0.0896992 0.0499010 0.0954442 0.229133 0.00559951 -0.0696637
-0.0242122 0.111817 -0.0789206 0.0837580 0.222271 -0.0611380 -0.112126
0.0329055 -0.0277847 -0.00797160 0.120486 -0.0643358 -0.0845771 0.0736635
0.288591 0.257914 -0.00181955 0.110898 0.147722  0.00000 0.000172809
0.00919736 0.0110891 0.0152384 0.00828677 0.748922 -0.123172 0.330259
0.206775 0.0476522 0.0280362 -0.101461 0.0463152 0.0216698 0.0248654
0.170800 0.0176330 -0.0380416 0.0317560 0.142628 -0.0190826 -0.0671519
-0.00752021  0.00000 -0.0188540 0.0493985 -0.0278924 0.112063 -0.127799
 0.00000 0.0437870 -0.0209737 0.0658322 -0.0777777 0.0965694 -0.0408791
0.127360 -0.174456  0.00000 -0.000274027 -0.0130361 2.24477e-05 9.26975e-05
0.0325827
 2.0e-05: 5773:     -1024.50;0.453311 -0.0585468 -0.0356171 0.0124542
0.0734133 0.0881771 0.0507491 0.0961264 0.230039 0.00643291 -0.0684440
-0.0227084 0.110856 -0.0735117 0.0881374 0.222234 -0.0600243 -0.108611
0.0331596 -0.0250605 -0.00237720 0.120825 -0.0585146 -0.0847106 0.0774464
0.290086 0.261511 -3.75224e-05 0.115623 0.142874  0.00000 0.000183608
0.000936919 0.000697002 0.000660236 0.00131320 0.749940 -0.124134 0.330879
0.212274 0.0477193 0.0297672 -0.103379 0.0384115 0.0255082 0.0174492
0.192128 0.0200405 -0.0419245 0.0218963 0.182316 -0.0385164 0.00556693
-0.0984504 0.00132058 -0.0348845 0.0586401 -0.0444801 0.190766 -0.212638
 0.00000 0.000668956 -0.000403147 0.00142120 -0.00175281  0.00000
0.000166863 -0.00106768 0.000914863 0.000467674 -0.00274150 0.00346480
0.00114500 -0.000318531 0.00111171
 2.0e-06: 9248:     -1024.51;0.453348 -0.0584731 -0.0355746 0.0125052
0.0734903 0.0881831 0.0507601 0.0960166 0.230127 0.00645362 -0.0684865
-0.0230384 0.110978 -0.0737085 0.0876321 0.222118 -0.0599844 -0.108724
0.0331173 -0.0250216 -0.00259363 0.120773 -0.0589481 -0.0846303 0.0772936
0.289112 0.261172 -0.000167296 0.115547 0.144054 9.37833e-07 -2.73741e-05
-4.00219e-05 1.07567e-05 3.85261e-05 3.10490e-05 0.750009 -0.124225
0.330861 0.212679 0.0477609 0.0298412 -0.103074 0.0371887 0.0264153
0.0144220 0.197224 0.0205217 -0.0404589 0.0134622 0.191228 -0.0472840
0.0469882 -0.142964 0.0109281 -0.0403486 0.0179550 -0.0346377 0.183949
-0.183214 0.00214351 -0.00362890 0.00276037 -0.0118387 0.0131118  0.00000
5.56133e-05 -0.000257805 0.000368822  0.00000 0.000199027 -0.000336293
0.000352925 -0.000454433 1.25530e-05
 2.0e-07: 15764:     -1024.51;0.453351 -0.0584700 -0.0355723 0.0124965
0.0734787 0.0881846 0.0507616 0.0960397 0.230123 0.00645416 -0.0684847
-0.0230381 0.110983 -0.0737035 0.0876777 0.222123 -0.0599873 -0.108719
0.0331109 -0.0250217 -0.00255763 0.120773 -0.0589676 -0.0846268 0.0772896
0.289120 0.261188 -0.000185005 0.115550 0.144119 2.65656e-06 -6.82831e-06
-2.76466e-05  0.00000 1.83385e-05 7.87228e-07 0.750014 -0.124224 0.330867
0.212670 0.0477592 0.0298445 -0.103084 0.0371768 0.0264019 0.0144639
0.197132 0.0205263 -0.0404874 0.0135545 0.191202 -0.0472145 0.0466104
-0.142610 0.0108195 -0.0403564 0.0185109 -0.0348318 0.184403 -0.183997
0.000797901 -0.00138066 0.00103545 -0.00442881 0.00490172  0.00000
7.51833e-06 -3.87226e-05 6.04092e-05  0.00000 3.63510e-06 -4.49410e-06
1.42910e-06 5.74795e-06 5.03001e-07
At return
30433:    -1024.5110: 0.453352 -0.0584699 -0.0355712 0.0124974 0.0734792
0.0881850 0.0507622 0.0960430 0.230123 0.00645469 -0.0684872 -0.0230377
0.110983 -0.0737051 0.0876725 0.222122 -0.0599864 -0.108715 0.0331096
-0.0250228 -0.00255738 0.120773 -0.0589583 -0.0846263 0.0772945 0.289133
0.261192 -0.000180489 0.115545 0.144090  0.00000 -1.01443e-06 -2.72331e-06
4.36692e-09 2.82744e-07 1.08531e-07 0.750012 -0.124223 0.330865 0.212669
0.0477582 0.0298448 -0.103086 0.0371794 0.0264002 0.0144699 0.197126
0.0205267 -0.0404964 0.0135892 0.191188 -0.0471837 0.0464530 -0.142440
0.0107821 -0.0403469 0.0186941 -0.0348872 0.184494 -0.184176 0.000119925
-0.000204327 0.000154724 -0.000663551 0.000735230 6.17347e-06 -3.20551e-06
1.16271e-05 -1.43359e-05  0.00000 1.40493e-06 -2.16589e-06 1.27346e-06
-2.57135e-06 2.74006e-08

*The model summary is as follows:*
Linear mixed model fit by REML ['lmerMod']
Formula: Stimulus.RRT ~ 1 + A * B * C + (1 + A |      Subject) + (1 + A * B
* C | Item) +
    (1 + A * B * C | Category)
   Data: edata
Control: lmerControl(optCtrl = list(maxfun = 5e+05))

REML criterion at convergence: -1024.5

Scaled residuals:
    Min      1Q  Median      3Q     Max
-4.1135 -0.6290 -0.0239  0.6658  3.4528

Random effects:
 Groups   Name        Variance  Std.Dev. Corr
 Item     (Intercept) 0.0079856 0.08936
          A1          0.0021904 0.04680  -0.25
          B1          0.0019678 0.04436  -0.16  0.07
          C1          0.0008949 0.02991   0.08 -0.46 -0.42
          A1:B1       0.0034753 0.05895   0.25 -0.14 -0.40  0.04
          A1:C1       0.0011016 0.03319   0.52  0.51  0.13 -0.73  0.10
          B1:C1       0.0010864 0.03296   0.30 -0.50 -0.21  0.65  0.68 -0.39
          A1:B1:C1    0.0047121 0.06864   0.28  0.18 -0.04  0.57  0.25 -0.11  0.64
 Subject  (Intercept) 0.0218561 0.14784
          A1          0.0048530 0.06966  -0.35
 Category (Intercept) 0.0017573 0.04192
          A1          0.0001050 0.01025   0.92
          B1          0.0001028 0.01014   0.58  0.22
          C1          0.0004833 0.02198  -0.92 -0.80 -0.71
          A1:B1       0.0014875 0.03857   0.19  0.56 -0.64 -0.09
          A1:C1       0.0001609 0.01268   0.41  0.09  0.70 -0.27 -0.69
          B1:C1       0.0014145 0.03761   0.08  0.17  0.05 -0.39  0.34 -0.67
          A1:B1:C1    0.0036161 0.06013   0.65  0.41  0.62 -0.44 -0.39  0.93 -0.65
 Residual             0.0388540 0.19711

Number of obs: 4064, groups:  Item, 68; Subject, 64; Category, 6

Fixed effects:
             Estimate Std. Error t value
(Intercept) -0.897455   0.027600  -32.52
A1           0.002163   0.012815    0.17
B1          -0.085623   0.038088   -2.25
C1           0.004287   0.014429    0.30
A1:B1       -0.005849   0.027516   -0.21
A1:C1        0.112263   0.075240    1.49
B1:C1        0.014424   0.026637    0.54
A1:B1:C1    -0.275033   0.152135   -1.81

Correlation of Fixed Effects:
         (Intr) A1     B1     C1     A1:B1  A1:C1  B1:C1
A1       -0.017
B1        0.030  0.012
C1       -0.349 -0.216 -0.064
A1:B1     0.093  0.091 -0.271 -0.028
A1:C1     0.028  0.014  0.006 -0.230 -0.027
B1:C1     0.044  0.000 -0.001 -0.114  0.138 -0.029
A1:B1:C1  0.071  0.025  0.010 -0.036 -0.033  0.010 -0.279

*However, if I reorder the dataset, the model failed to converge:*
*edata.reordered<-edata[order(edata$Subject,edata$Item),]*

*Model2<-lmer(Stimulus.RT~
1+A*B*C+(1+A|Subject)+(1+A*B*C|Item)+(1+A*B*C|Category),
data=edata.reordered,verbose=2,control=lmerControl(optCtrl=list(maxfun=500000)))*

*The model failed to converge, with the following warning messages given.
The verbose results are as follows:*
npt = 77 , n =  75
rhobeg =  0.2 , rhoend =  2e-07
   0.020:  86:     -783.903;0.492578 -0.0462110 -0.0363706 -0.0207690
-0.00431843 0.00131644 -0.00481740 -0.00308283 0.502089 -0.0211198
-0.0336018 -0.0237567 -0.00155808 -0.0307396 -0.000390386 0.354830
-0.0308058 -0.0351383 -0.0137964 -0.0238208 -0.0133714 0.305507 -0.0174871
-0.0328382 -0.00155512 0.00996183 0.646703 -0.0258383 -0.00140789
-0.00146090 0.593574 -0.0176632 -0.0351208 0.611448 0.000434827 0.819591
0.735976 -0.105154 0.553306 0.948775 -0.00950248 -0.00976528 -0.0113362
-0.00949456 -0.00980388 -0.0100146 -0.00779947 0.946928 -0.00984788
-0.0101504 -0.00957164 -0.00986887 -0.00987130 -0.00960832 0.946926
-0.0101303 -0.0103410 -0.00976277 -0.00978197 -0.00937871 0.947412
-0.00988796 -0.0100200 -0.0103097 -0.0103002 0.949607 -0.0106541
-0.00943222 -0.0111189 0.947582 -0.0104333 -0.00883065 0.949409 -0.0119380
0.954371
  0.0020: 1139:     -1011.00;0.450616 -0.0596810 -0.0402272 0.0225926
0.0713467 0.103440 0.0602312 0.0721565 0.235672 0.00831575 -0.0664559
-0.0410181 0.131944 -0.0611257 0.0909471 0.234925 -0.0677961 -0.0928579
0.0491887 -0.0160378 -0.0609188 0.111951 -0.0797617 -0.0797398 0.105130
0.265564 0.270820 0.0170410 0.124769 0.199425 0.00278578 0.0268160
-0.00875111 0.0510349 0.110946 0.171222 0.758551 -0.121379 0.340244
0.289164 0.0340301 0.0299344 -0.127966 0.0945493 -0.0521814 -0.0136706
0.0808901 0.00103150 0.00478129 -0.0168369 -0.0419573 0.0154279 -0.0318584
-0.0284232 0.0113707 -0.00386646 -0.00455879 0.0266343 -0.0767034 0.0166990
0.0104083 -0.0298060 0.0336969 -0.0939880 -0.0530341 0.234749 -0.0630262
0.0236229 -0.0798100 0.0822270 -0.0883803 0.0469641 0.374075 0.0619410
0.530030
 0.00020: 2754:     -1024.04;0.453973 -0.0580443 -0.0341541 0.0128765
0.0743429 0.0876562 0.0513254 0.0961985 0.231256 0.00668522 -0.0689680
-0.0198556 0.111337 -0.0720035 0.0878909 0.222762 -0.0592635 -0.109626
0.0323470 -0.0244422 -0.00152948 0.121270 -0.0558188 -0.0840473 0.0772543
0.292009 0.262116 0.000440403 0.116106 0.141180 0.00136894 -0.00339338
-0.00197460 0.00348747 0.0105328 0.0115974 0.749943 -0.123029 0.330931
0.211777 0.0482195 0.0264188 -0.104392 0.0473197 0.0235507 0.0216781
0.184163  0.00000 0.0428526 -0.0310683 -0.175357 0.0368789 0.00217481
0.0939259  0.00000 0.0261004 -0.0313647 0.0268754 -0.122563 0.125346
0.000144427 -0.00203618 -0.000134387 -0.00154909 -0.000171925 0.0680554
-0.0371162 0.127778 -0.153718 0.0109013 -0.0471354 0.0631039 0.0408311
-0.0496230 0.0145855
 2.0e-05: 5427:     -1024.35;0.453500 -0.0588233 -0.0356824 0.0125644
0.0734100 0.0880990 0.0508688 0.0959347 0.230738 0.00618009 -0.0685843
-0.0222441 0.111017 -0.0734275 0.0869347 0.222257 -0.0600188 -0.109157
0.0332115 -0.0247808 -0.00219908 0.120720 -0.0585300 -0.0847411 0.0775846
0.290674 0.261498 -0.000106363 0.115210 0.143180 9.04281e-05 0.000517393
-0.000649503 0.000192640 0.000530472 0.00197453 0.749767 -0.124101 0.330961
0.212375 0.0487225 0.0278073 -0.102167 0.0463200 0.0243823 0.0162048
0.190955  0.00000 0.0439928 -0.0318335 -0.176979 0.0374637 0.00573705
0.0905893 0.00202642 0.0324678 -0.0734993 0.0480481 -0.190881 0.220526
0.000424046 -0.000319556 0.000485731 -0.00232483 0.00320987 0.00183945
-0.000969938 0.00300023 -0.00391019  0.00000 0.00377342 -0.00500665
0.000579337 -0.00215229 0.00124792
 2.0e-06: 6708:     -1024.35;0.453448 -0.0588097 -0.0355748 0.0125304
0.0734757 0.0881016 0.0508944 0.0958161 0.230754 0.00612699 -0.0684914
-0.0222696 0.110948 -0.0734877 0.0875664 0.222282 -0.0600764 -0.109357
0.0332592 -0.0250735 -0.00216874 0.120772 -0.0592389 -0.0846883 0.0772328
0.289264 0.261555 -0.000223966 0.115482 0.143552 1.82883e-06 8.95088e-05
0.000234107 4.38268e-06 -0.000100557 0.000268633 0.750021 -0.124161
0.330928 0.212373 0.0486257 0.0278246 -0.102272 0.0464224 0.0241039
0.0167711 0.190122  0.00000 0.0439680 -0.0319783 -0.176695 0.0374161
0.00595413 0.0898001 0.00213386 0.0324226 -0.0734879 0.0479364 -0.190868
0.220519 0.000515865 -0.000947565 0.000694358 -0.00286088 0.00324899
 0.00000 9.25677e-05 -0.000493308 0.000515791  0.00000 3.22998e-05
1.34428e-06 0.000265348 -0.000330170  0.00000
 2.0e-07: 8013:     -1024.35;0.453452 -0.0588158 -0.0355775 0.0125249
0.0734807 0.0881052 0.0508818 0.0958264 0.230755 0.00613428 -0.0684918
-0.0222859 0.110940 -0.0735018 0.0875183 0.222287 -0.0600648 -0.109338
0.0332510 -0.0250861 -0.00212940 0.120774 -0.0592879 -0.0846939 0.0772195
0.289318 0.261524 -0.000225946 0.115515 0.143581 2.96479e-06 4.74658e-07
-4.67167e-06 6.32876e-07 -1.68484e-05 1.97228e-05 0.750021 -0.124164
0.330935 0.212387 0.0486304 0.0278256 -0.102275 0.0464074 0.0241141
0.0168024 0.190120  0.00000 0.0439746 -0.0319812 -0.176684 0.0373949
0.00599897 0.0897709 0.00214402 0.0324037 -0.0735321 0.0479302 -0.190866
0.220458 0.000500648 -0.000957101 0.000681481 -0.00281640 0.00318786
3.86116e-05 -1.87433e-05 6.35711e-05 -8.10232e-05 2.14600e-06 -9.10076e-06
1.52841e-05 1.48079e-05 -2.33962e-05 4.43378e-06
At return
9308:    -1024.3493: 0.453452 -0.0588161 -0.0355779 0.0125253 0.0734813
0.0881053 0.0508814 0.0958291 0.230755 0.00613352 -0.0684914 -0.0222835
0.110941 -0.0734998 0.0875198 0.222286 -0.0600641 -0.109337 0.0332509
-0.0250859 -0.00212406 0.120774 -0.0592844 -0.0846942 0.0772196 0.289324
0.261524 -0.000224812 0.115519 0.143582  0.00000 6.69837e-07 2.90331e-06
5.81219e-07 -6.03136e-07 3.60563e-06 0.750021 -0.124164 0.330934 0.212389
0.0486307 0.0278253 -0.102276 0.0464083 0.0241164 0.0168007 0.190123
 0.00000 0.0439749 -0.0319808 -0.176684 0.0373947 0.00599638 0.0897735
0.00214274 0.0324038 -0.0735262 0.0479285 -0.190864 0.220444 0.000501759
-0.000958368 0.000680990 -0.00282345 0.00319184 6.93762e-06 -2.82172e-06
7.79924e-06 -1.11704e-05 1.20920e-08 2.23002e-06 -2.13124e-06 2.39686e-06
-2.38538e-06 8.82173e-08
*Warning messages:*
*1: In checkConv(attr(opt, "derivs"), opt$par, ctrl = control$checkConv,  :*
*  Model failed to converge with max|grad| = 0.761054 (tol = 0.002,
component 54)*
*2: In checkConv(attr(opt, "derivs"), opt$par, ctrl = control$checkConv,  :*
*  Model failed to converge: degenerate  Hessian with 1 negative
eigenvalues*

*The model summary is as follows:*
Linear mixed model fit by REML ['lmerMod']
Formula: Stimulus.RRT ~ 1 + RanOrCat * Direction * Running + (1 + RanOrCat
|      Subject) + (1 + RanOrCat * Direction * Running | Item) +
    (1 + RanOrCat * Direction * Running | Category)
   Data: Edata.Correct.NoBadItems.reordered
Control: lmerControl(optCtrl = list(maxfun = 5e+05))

REML criterion at convergence: -1024.3

Scaled residuals:
    Min      1Q  Median      3Q     Max
-4.1137 -0.6293 -0.0207  0.6657  3.4540

Random effects:
 Groups   Name                          Variance  Std.Dev. Corr
 Item     (Intercept)                   7.989e-03 0.089382
          RanOrCat1                     2.203e-03 0.046939 -0.25
          Direction1                    1.970e-03 0.044390 -0.16  0.07
          Running1                      8.953e-04 0.029921  0.08 -0.46 -0.42
          RanOrCat1:Direction1          3.488e-03 0.059055  0.25 -0.13 -0.40  0.04
          RanOrCat1:Running1            1.101e-03 0.033188  0.52  0.51  0.13 -0.73  0.11
          Direction1:Running1           1.085e-03 0.032941  0.30 -0.50 -0.21  0.65  0.67 -0.39
          RanOrCat1:Direction1:Running1 4.708e-03 0.068615  0.28  0.18 -0.04  0.57  0.25 -0.11  0.64
 Subject  (Intercept)                   2.186e-02 0.147840
          RanOrCat1                     4.854e-03 0.069672 -0.35
 Category (Intercept)                   1.753e-03 0.041865
          RanOrCat1                     9.189e-05 0.009586  1.00
          Direction1                    1.054e-04 0.010266  0.53  0.53
          Running1                      4.870e-04 0.022067 -0.91 -0.91 -0.72
          RanOrCat1:Direction1          1.507e-03 0.038816  0.24  0.24 -0.65 -0.07
          RanOrCat1:Running1            1.662e-04 0.012892  0.37  0.37  0.71 -0.29 -0.70
          Direction1:Running1           1.428e-03 0.037790  0.09  0.09  0.03 -0.38  0.36 -0.68
          RanOrCat1:Direction1:Running1 3.606e-03 0.060051  0.62  0.62  0.61 -0.44 -0.39  0.93 -0.66
 Residual                               3.885e-02 0.197114

Number of obs: 4064, groups:  Item, 68; Subject, 64; Category, 6

Fixed effects:
                               Estimate Std. Error t value
(Intercept)                   -0.897433   0.027587  -32.53
RanOrCat1                      0.002115   0.012737    0.17
Direction1                    -0.085626   0.038095   -2.25
Running1                       0.004295   0.014451    0.30
RanOrCat1:Direction1          -0.005863   0.027579   -0.21
RanOrCat1:Running1             0.112270   0.075247    1.49
Direction1:Running1            0.014433   0.026681    0.54
RanOrCat1:Direction1:Running1 -0.275019   0.152130   -1.81

Correlation of Fixed Effects:
            (Intr) RnOrC1 Drctn1 Rnnng1 RnOC1:D1 ROC1:R Dr1:R1
RanOrCat1   -0.014
Direction1   0.028  0.023
Running1    -0.345 -0.229 -0.065
RnOrCt1:Dr1  0.109  0.028 -0.272 -0.020
RnOrCt1:Rn1  0.027  0.020  0.007 -0.231 -0.028
Drctn1:Rnn1  0.049 -0.016 -0.002 -0.111  0.145   -0.030
RnOC1:D1:R1  0.068  0.034  0.010 -0.036 -0.032    0.010 -0.279

*If we compare the two model summaries, we can see that they are very
similar, with only very small differences in the values. Nevertheless,
because Model2 failed to converge, I don't know whether its summary is
still reliable and appropriate to report.*
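
*A rough sketch of how the two fits could be compared numerically rather
than by eye (assuming both fits are kept as Model1 and Model2, with
matching variable names in both data frames):*

library(lme4)

## REML criteria of the two fits
c(REMLcrit(Model1), REMLcrit(Model2))

## fixed effects and standard errors side by side
round(cbind(est1 = fixef(Model1), est2 = fixef(Model2),
            se1 = sqrt(diag(vcov(Model1))),
            se2 = sqrt(diag(vcov(Model2)))), 6)

## random-effect variances and correlations
VarCorr(Model1)
VarCorr(Model2)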

*Thank you for your time and I truly welcome and appreciate your comments.*

Best,
Zhaohong
#
On 15-01-07 07:13 PM, Zhaohong wrote:
We have seen one case where the order does change the results slightly
(https://github.com/lme4/lme4/issues/262) , and have a
not-yet-reproducible report (see prev link) of a case where the order
changes the standard error estimates considerably more.  That the
order affects the results is surprising, but believable; I/we haven't
had a chance yet to dig through and figure out how the ordering could
change the linear algebra, but clearly it does.

   A couple of comments:

* the factor names are different in your two examples (A/B/C vs
RanOrCat/Direction/Running etc.) -- that makes me mildly suspicious that
there might be some other difference in the input data, but maybe you
just forgot to rewrite something.

   Probably the simplest way to double-check these results is to do an
experiment where you use the estimates from each fit as starting values
for the other (e.g. update(model1, start=getME(model2, "theta"))) and see
what happens. Hopefully starting from each of those (very similar)
starting points will get you to the same place.
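
A minimal sketch of that cross-starting experiment (assuming the two fits
are stored as Model1 and Model2):

library(lme4)

## restart each fit from the other's estimated covariance parameters
Model1.from2 <- update(Model1, start = getME(Model2, "theta"))
Model2.from1 <- update(Model2, start = getME(Model1, "theta"))

## if both runs found the same optimum, these criteria should agree
## (up to numerical tolerance) with each other and with the originals
c(REMLcrit(Model1.from2), REMLcrit(Model2.from1))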

  It would also be worth trying a different optimizer (see
https://rpubs.com/bbolker/lme4trouble1 for examples).
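
For example (a sketch only; allFit() may need to be sourced separately in
older lme4 versions):

library(lme4)

## refit with a different optimizer
Model2.bobyqa <- update(Model2,
    control = lmerControl(optimizer = "bobyqa",
                          optCtrl = list(maxfun = 5e5)))

## or try all available optimizers and compare the results
aa <- allFit(Model2)
ss <- summary(aa)
ss$llik    ## log-likelihoods across optimizers
ss$fixef   ## fixed-effect estimates across optimizers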

  Having a random effect with only 6 levels (Category) is pushing the
envelope a bit, especially as you're trying to fit an 8x8
variance-covariance
matrix ((A*B*C|Category)); you might try recasting that as a fixed effect.
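
A sketch of that recasting, using the variable names from the first model
above (whether to cross Category with the other factors or include only
its main effect is a modelling choice):

library(lme4)

Model3 <- lmer(Stimulus.RT ~ 1 + A * B * C * Category +
                 (1 + A | Subject) + (1 + A * B * C | Item),
               data = edata, verbose = 2,
               control = lmerControl(optCtrl = list(maxfun = 5e5)))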
From comparing with the results below, it looks like the first report
(at step "0.020: 86") is identical between the two runs, but that the two
runs have diverged slightly by the second report ("0.0020: 1139" in one
case, "0.0020: 1266" in the other).
#
Hi,

Does it really change the linear algebra, or does it change its
numerical implementation in the computer? Since floating-point addition is
not associative, that would explain ordering effects...
[1] 1.110223e-15
[1] 1e-15
version.string R version 3.0.1 (2013-05-16)
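
(The exact expressions behind the output above are not shown; for
instance, one can see the effect with:)

## floating-point addition is not associative, so the order of
## summation can change the result slightly
(0.1 + 0.2) + 0.3 == 0.1 + (0.2 + 0.3)        ## FALSE
((0.1 + 0.2) + 0.3) - (0.1 + (0.2 + 0.3))     ## about 1.1e-16

## the same happens when accumulating a long vector in different orders
x <- runif(1e5)
Reduce(`+`, x) - Reduce(`+`, rev(x))          ## usually a tiny nonzero value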

In other words, maybe computation with such big matrices can lead to
rounding errors that add up quickly if the matrices are somehow
ill-conditioned?

I'm not familiar with this field, but for instance I noticed that when
you try to compute linear-model matrices, coefficients and so on using
the projection formula, even with as few as about ten values, entries
that should be 0 in the projection matrices (and especially their
inverses) quickly end up around ~1e-16, and after that things can get
very bad very quickly, which is solved by adding a "cleaning" of the
matrices such as m[ abs(m) < 1e-10 ] <- 0...
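
A toy example of what I mean (not the actual computation I ran):

set.seed(1)
X <- cbind(1, rnorm(10), rnorm(10))        ## small design matrix
H <- X %*% solve(t(X) %*% X) %*% t(X)      ## hat (projection) matrix
M <- diag(nrow(X)) - H                     ## projector onto the residual space

MX <- M %*% X               ## mathematically the zero matrix
range(MX)                   ## in practice, entries around 1e-16 or so
MX[abs(MX) < 1e-10] <- 0    ## the crude "cleaning" described above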

Of course, the above approach is very dirty and inefficient and is only
of pedagogical interest, but are the more subtle algorithms robust enough
in that respect for such big matrices?
On Wed, Jan 07, 2015 at 11:12:55PM -0500, Ben Bolker wrote:
>   We have seen one case where the order does change the results slightly
> (https://github.com/lme4/lme4/issues/262) , and have a
> not-yet-reproducible report (see prev link) of a case where the order
> changes the standard error estimates considerably more.  That the
> order affects the results is surprising, but believable; I/we haven't
> had a chance yet to dig through and figure out how the ordering could
> change the linear algebra, but clearly it does.
#
On floating point in R, an enjoyable discussion of this "Circle 1" of the inferno is in Patrick Burns's The R Inferno:

http://www.burns-stat.com/pages/Tutor/R_inferno.pdf
