Channel: Statalist
Viewing all 72772 articles

Collapse data

Hi,

I have monthly panel data that I want to convert to quarterly frequency. I used the following commands:

gen month = substr(E,1,2)
destring month, replace
destring D, replace
gen qdate = qofd(dofm(ym(D, month)))
format %tq qdate

But when I then run the collapse command, "collapse pr, by(ID qdate)", it shows an error. Here is an extract of my data:


"B-D. (Mining And Quarrying; Manufacturing; Electricity, Gas, Steam And Air Conditioning Supply)" 2010 "01-January" "56.8" 1 1 200
"B-D. (Mining And Quarrying; Manufacturing; Electricity, Gas, Steam And Air Conditioning Supply)" 2010 "03-March" "68.09999999999999" 1 3 200
"B-D. (Mining And Quarrying; Manufacturing; Electricity, Gas, Steam And Air Conditioning Supply)" 2010 "02-February" "57.5" 1 2 200
"B-D. (Mining And Quarrying; Manufacturing; Electricity, Gas, Steam And Air Conditioning Supply)" 2010 "06-June" "72.2" 1 6 201
"B-D. (Mining And Quarrying; Manufacturing; Electricity, Gas, Steam And Air Conditioning Supply)" 2010 "05-May" "69.09999999999999" 1 5 201
"B-D. (Mining And Quarrying; Manufacturing; Electricity, Gas, Steam And Air Conditioning Supply)" 2010 "04-April" "66.09999999999999" 1 4 201
"B-D. (Mining And Quarrying; Manufacturing; Electricity, Gas, Steam And Air Conditioning Supply)" 2010 "07-July" "72.7" 1 7 202
"B-D. (Mining And Quarrying; Manufacturing; Electricity, Gas, Steam And Air Conditioning Supply)" 2010 "08-August" "70" 1 8 202
"B-D. (Mining And Quarrying; Manufacturing; Electricity, Gas, Steam And Air Conditioning Supply)" 2010 "09-September" "67.7" 1 9 202
"B-D. (Mining And Quarrying; Manufacturing; Electricity, Gas, Steam And Air Conditioning Supply)" 2010 "12-December" "85.09999999999999" 1 12 203
"B-D. (Mining And Quarrying; Manufacturing; Electricity, Gas, Steam And Air Conditioning Supply)" 2010 "11-November" "70" 1 11 203
"B-D. (Mining And Quarrying; Manufacturing; Electricity, Gas, Steam And Air Conditioning Supply)" 2010 "10-October" "77.3" 1 10 203
"B-D. (Mining And Quarrying; Manufacturing; Electricity, Gas, Steam And Air Conditioning Supply)" 2011 "01-January" "70.7" 1 1 204
"B-D. (Mining And Quarrying; Manufacturing; Electricity, Gas, Steam And Air Conditioning Supply)" 2011 "03-March" "79" 1 3 204
"B-D. (Mining And Quarrying; Manufacturing; Electricity, Gas, Steam And Air Conditioning Supply)" 2011 "02-February" "68.5" 1 2 204
"B-D. (Mining And Quarrying; Manufacturing; Electricity, Gas, Steam And Air Conditioning Supply)" 2011 "05-May" "79.5" 1 5 205

How can I solve this problem? Please help!
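In the extract the pr values are quoted (e.g. "56.8"), which suggests pr is still stored as a string; collapse cannot average a string variable. A minimal sketch of the full conversion, assuming pr holds the monthly values and D and E are the year and month variables as in the post:

Code:
gen month = substr(E,1,2)
destring month D pr, replace           // collapse needs numeric variables
gen qdate = qofd(dofm(ym(D, month)))   // monthly date -> quarterly date
format %tq qdate
collapse (mean) pr, by(ID qdate)       // quarterly average of the monthly series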

How can I handle "id: maxvar too small" in a Bayesian binary item response theory model using bayesmh

Please, I need help on this. I have gone through the manual, the Stata blog, and past Statalist posts.
Thanks.
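The "maxvar too small" error means the model (here, the many random-effect parameters of the IRT specification) needs more variables than Stata's current limit. In Stata/SE or MP the limit can be raised before loading the data; Stata/IC's limit is fixed. A minimal sketch:

Code:
clear all            // maxvar can only be changed when no data are in memory
set maxvar 32767     // raise the limit (Stata/SE and MP only)
display c(maxvar)    // confirm the new setting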

Extracting model fit for a pre-specified logistic model.

Hi all,

I want to establish a (logistic regression) model with regards to disease relapse within the first year of a certain treatment and assess performance.

There was missing data, so I imputed multiple data sets. I established a model in each of these sets and chose predictors for a final model. The final model has 6 predictors and looks like this:
"logit relapse duration esr hb sex mh_card mh_malign"

The first problem occurs, when I want to bootstrap validate the model.
For this I fit a model with the 6 predictors in a bsample (from 1 of the imputed data sets).
The model fit in the bsample and model coefficients were extracted.
These model coefficients then have to be applied in the original sample, and model fit (likelihood) and predicted probabilities have to be extracted.
This is where I encountered a problem: in forcing/pre-specifying the entire model with coefficients in the original sample, and extracting not just the predictions, but also model likelihood.

The second problem is in line with the first. This is when I want to adjust model coefficients based on a shrinkage factor, and extract likelihood and predictions.
- As an example: the regression coefficient of 'esr' is 1.5; we multiply this by 0.90 (the shrinkage factor) and get an adjusted coefficient of 1.35.
- We then want to use (all of) the adjusted coefficients, so in the example the 1.35, and extract the likelihood and predictions of this model.

I am using Stata/IC version 13.1 for Windows.

Code:
*the basic model for the first imputed data set
logit  relapse duration esr hb sex  mh_card mh_malign if _mi_m==1
* below for getting a heuristic uniform shrinkage factor, and adjust coefficients by that
generate df = e(df_m)
generate shrinkage = (Chi-df)/Chi
foreach var of varlist B_* {
                generate adj`var' = `var' * shrinkage
                * here B_ are the (unadjusted) regression coefficients of the final (original) model (previously saved as variables)
}
 
* below is the prediction model for the first imputed data set (_mi_m==1) and the process I (tried to) use for extracting the model performance I want
local modelv duration esr hb sex  mh_card mh_malign
qui logit relapse `modelv' if _mi_m==1 // here we want to use our own, pre-determined, coefficients/model
sca n_obs = e(N)
sca llF = e(ll)       // this is what I can't seem to calculate
sca llR = e(ll_0) // this can be calculated using an intercept only model
sca r2ml = 1 - exp((2*(llR-llF)/n_obs))
sca r2cu = (r2ml)/(1-exp(2*llR/n_obs))
qui generate R2BO = r2cu            //  nagelkerke's R2
qui predict predictedBO if _mi_m==1 // for performance based on predictions                
qui brier relapse predictedBO // Brier score performance e.g.
With kind regards,

Thomas Bolhuis
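One way around re-estimation is to compute the log likelihood of a pre-specified (or shrunken) model directly: build the linear predictor from the fixed coefficients, convert it to probabilities, and sum the Bernoulli log-likelihood contributions. A sketch with placeholder coefficient values (substitute the adjusted B_* values from the post):

Code:
* linear predictor from pre-specified coefficients (values are placeholders)
generate xb = -2.1 + 1.35*esr + 0.8*duration /* + remaining terms */ if _mi_m==1
generate p  = invlogit(xb)
* log likelihood of the fixed model in this sample
generate ll_i = relapse*ln(p) + (1-relapse)*ln(1-p)
quietly summarize ll_i
scalar llF = r(sum)   // can stand in for e(ll) in the R-squared formulas

The predicted probabilities p can likewise feed the Brier score step, with no model fitting in the original sample at all.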

Multilevel modeling: further testing of cross-level interaction terms after menbreg

Hi,
I would highly appreciate any help in probing the significant cross-level interaction terms in this model. I have a multilevel model (individuals nested in organizational units). IV1 is at the unit level; the moderator and the two DVs are at the individual level. I use multilevel SEM with a negative binomial link function in Stata to test it. The DVs are count variables (and may be correlated).

gsem (DV1 <- IVs Moderator c.IV1#c.Moderator M1[unit]) (DV2 <- IVs Moderator c.IV1#c.Moderator M1[unit]), nbreg

A cross-level direct effect (the unit-level IV is significant for the individual-level DV) and a cross-level interaction (unit-level IV, individual-level moderator and DVs) are significant. I have been asked to explain and test this cross-level interaction further after running this multilevel negative binomial model.
  1. My approach with an ordinary regression would be to test and plot the simple slopes of the IV-DV relationship at one standard deviation above and below the mean of the moderator. Is this correct after negative binomial regression? Do I have to do anything further related to the multilevel nature of the model?
  2. How can I get the predicted values for the two DVs after gsem with nbreg? Or after menbreg?
  3. We can use the margins command for logit/probit models in Stata to test interactions and plot them. (a) Can I use such a command after a multilevel negative binomial model? (b) What command can I use after gsem with the nbreg link to get the margins?
Thanks in advance,
Saeede
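One note on the model statement: in gsem an interaction must be written with factor-variable notation (c.IV1#c.Moderator); the * form is not an interaction operator there. With the model written that way, predict and margins can be used for probing. A sketch with hypothetical moderator values standing in for the mean plus/minus one SD (exact option names may vary by Stata version; see help gsem postestimation):

Code:
gsem (DV1 <- c.IV1##c.Moderator M1[unit]) (DV2 <- c.IV1##c.Moderator M1[unit]), nbreg
predict mu1, mu outcome(DV1)          // predicted mean count for DV1
* simple slope of IV1 at low/high moderator values (2 and 6 are placeholders)
margins, dydx(IV1) at(Moderator=(2 6)) predict(mu outcome(DV1))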

mnps command to execute R in Stata

Hello,

I'd like to estimate propensity scores for multiple treatments and am trying to use the mnps command in Stata (https://www.rand.org/statistics/twang/downloads.html). I have followed the code suggested by the online tutorial. However, I get the following error: Fatal error: you must specify '--save', '--no-save' or '--vanilla'. I would be grateful for any advice on how to address this issue.

Code:
.

global ado "/Users/smp462/Desktop/State/twang/ado files"
adopath + "$ado"

mnps st $demo $race $income $edu $married $dual $living $area $health1 $health2 i.year, ///
   ntrees(3000) intdepth(3) shrinkage(0.01) ///
   stopmethod(es.mean ks.mean) estimand(ATE) ///
   rcmd(/usr/local/bin/r/) ///
   objpath(/Users/smp462/Desktop/Stata/twang/output) ///
   plotname(/Users/smp462/Desktop/Stata/twang/mnps_example_plot.pdf)

Running R script, please wait...

ARGUMENT '/Users/smp462/Desktop/Stata/twang/output/mnps.R' __ignored__

Fatal error: you must specify '--save', '--no-save' or '--vanilla'
Error: R did not complete successfully.
file /Users/smp462/Desktop/Stata/twang/output/mnps.Rout not found
r(601);
FYI, the following is the code suggested by the online tutorial (https://www.rand.org/statistics/twan...-tutorial.html).
Code:
mnps treat illact crimjust subprob subdep white, ///
ntrees(3000) intdepth(3) shrinkage(0.01) ///
stopmethod(es.mean ks.mean) estimand(ATE) ///
rcmd(C:\Program Files\R\R-3.3.1\bin\Rscript.exe) ///
objpath(C:\Users\username\twang\output) ///
plotname(C:\users\username\twang\output\mnps_example_plot.pdf)
save C:\Users\username\twang\output\aod_ate_wgts, replace
Thank you in advance!
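The "--save/--no-save/--vanilla" message is what the plain R binary prints when invoked non-interactively; rcmd() should point at the Rscript executable itself, not at R or a directory. A sketch of the corrected option (the path is an assumption; check it with `which Rscript` in a terminal):

Code:
mnps st $demo $race $income $edu $married $dual $living $area ///
    $health1 $health2 i.year, ///
    ntrees(3000) intdepth(3) shrinkage(0.01) ///
    stopmethod(es.mean ks.mean) estimand(ATE) ///
    rcmd(/usr/local/bin/Rscript) ///
    objpath(/Users/smp462/Desktop/Stata/twang/output) ///
    plotname(/Users/smp462/Desktop/Stata/twang/mnps_example_plot.pdf)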

Create a continuous variable for blood pressure using two variables

Dear List,

I wonder if any of you knows how to make a single blood pressure variable from two variables. I have continuous variables for systolic and diastolic pressure: systolic_bp and diastolic_bp. Blood pressure is conventionally expressed as systolic over diastolic (for instance 145/83). I want to create a variable, bp, that looks like this (example):
systolic_bp diastolic_bp bp
180 95 180/95
140 90 140/90
137 75 137/75
etc...
Any suggestions?

Best regards,
Sigrid
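A value like 180/95 is text rather than a number, so the combined variable would be a string built from the two numeric readings. A minimal sketch:

Code:
* combine the two numeric readings into one display string, e.g. "180/95"
generate bp = string(systolic_bp) + "/" + string(diastolic_bp)

For analysis the two components (or derived measures such as mean arterial pressure) usually stay numeric; the string form is for display only.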

How to convert text label into numerical data


Because these variables are stored as text, I cannot draw a graph. How can I convert them into numerical data?
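Without seeing the variables, two standard tools apply, depending on what the text contains; a sketch with a hypothetical variable name myvar:

Code:
* if the text holds numbers, e.g. "57.5":
destring myvar, replace
* if the text holds category names, make a labeled numeric variable instead:
encode myvar, generate(myvar_num)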

Creating a fixed effect

Hey, I have panel data and need to add a race-by-grade fixed effect; I have two separate variables, race and grade. To create the fixed effect I used gen Race_grade = race*grade, which seems to have worked, but I was also considering egen Race_grade = group(race grade). Which of the two is better, and is there any difference between them?
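The two are not equivalent: multiplying codes can map different (race, grade) cells to the same value (for example 1 x 4 and 2 x 2 both give 4), while egen group() assigns a distinct integer to every combination. A minimal sketch, with hypothetical outcome y and regressor x:

Code:
egen Race_grade = group(race grade)   // one distinct ID per (race, grade) cell
regress y x i.Race_grade              // enter the cell IDs as fixed effects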

Using asdoc to export two-sample t-test by groups and other summary statistics

Hi,

I would like to use asdoc to export summary statistics (a two-sample t-test of the difference between the two groups, as well as the mean and sd by group) for a number of variables (lnassets roi lev cagr vol cfvol lncash delist), based on the dummy variable BClaw. BClaw equals 1 if the state the firm (gvkey) is incorporated in (incorpn) has passed a certain law. Here is an example of my data:

Code:
* Example generated by -dataex-. To install: ssc install dataex
clear
input long(gvkey incorpn datadate) float(BClaw lnassets roi lev cagr vol cfvol lncash delist)
1003  9  8400 1   1.7284646    .1649503  .2224787          .         .          .   -1.1615521 0
1003  9  8765 1   2.1434722    .1231094 .14069645          .         .          .     .7045816 0
1003  9  9131 1   2.1091218   .04696032 .11527728          .  .3243487   .1695239   -1.3205066 0
1003  9  9527 1    2.638343  .016869193 .33595425  35.431084 .22307177          .    -5.298317 0
1003  9  9892 1    2.680062    .0543672  .2579871   19.58573 .13903955  .05188768   -1.4229584 0
1003  9 10257 1   2.7752104  .011594565  .3433487  24.860825  .3012686  .06689375    -.7444405 0
1003  9 10623 1    2.789937   -.4814496  .4748157   5.183002 .25959867  .11679138   -1.1973282 1
1003  9 10988 1    2.313426  -.02186171  .4476209 -11.503928         .          .            . 0
1004  9  6360 1    3.818811   .03953897  .4840834          .   .447139          .     .6941467 0
1004  9  6725 1    4.034276  .035662454   .474585          .  .3880906          .     .9764447 0
1004  9  7090 1   4.1619096   .04681123  .4885971          .  .4276877          .     .9321641 0
1004  9  7456 1   4.3007894   .05044876 .46216005  17.428518  .4554453          .    1.1226543 0
1004  9  7821 1    4.419744   .04340656   .467108   13.71092  .4464859          .    1.1098819 0
1004  9  8186 1     4.73315  .010778422  .3832191   20.97496  .4508259  .14559905     .5590442 0
1004  9  8551 1   4.7121215  .025115017  .3120642  14.695506 .43949175  .03337184 -.0020020027 0
1004  9  8917 1    4.921644   .03269741 .11526802  18.210882  .2905666  .01543511    .18481843 0
1004  9  9282 1    5.046035   .05837006 .18887423   10.99277  .2757429  .04981864     .8006554 0
1004  9  9647 1    5.289715     .058027  .2869578   21.23145  .3445867  .07587934    1.2610146 0
1004  9 10012 1    5.459973   .06534065 .20540556  19.655066  .3224897 .033278447      .958967 0
1004  9 10378 1    5.652307   .07451535  .2549553    22.3959  .4922575  .06017629            . 0
1004  9 10743 1    5.876029   .06962577 .26930535    21.5843  .2821075 .005485314    1.5184186 0
1004  9 11108 1    5.962347   .06603247  .2732156  18.229586  .3325457  .01752719    1.3972343 0
1004  9 11473 1    5.940061   .03895431 .22490117   10.06688  .5263988  .04419932     .4401886 0
1004  9 11839 1    5.979774   .02534457 .23353425   3.518675  .3404748  .02192023     .8109302 0
1004  9 12204 1    5.900311 .0007750219 .25009653 -2.0466413 .28526157  .01651055       .81315 0
1004  9 12569 1    6.034586   .02273326 .27847165  3.2010174 .22665544 .035685968    2.8944745 0
1004  9 12934 1    6.054003   .02457176 .28509632   2.505153  .2345718  .03266471    3.1129375 0
1004  9 13300 1    6.081867  .036569934 .27353454   6.238753 .27757648 .015314956     3.514705 0
1004  9 13665 1    6.272092   .04347752  .2233678   8.238668  .3334143 .005712112    3.9455545 0
1004  9 14030 1    6.508111   .05317504  .2650714  16.342669   .199219  .04456652    2.8461876 0
1004  9 14395 1    6.588418   .05734831  .2495892   18.39426  .4494495 .011976158    2.1102133 0
1004  9 14761 1    6.607998   .04745357 .27903044  11.847786   .534973  .02221332     .2159175 0
1004  9 15126 1    6.553725   .02640293  .2758964  1.5320748  .5737165   .0482364    2.6253204 0
1004  9 15491 1    6.565545  -.08298942    .36641  -.7595075  .4977813  .06240025     3.541597 0
1004  9 15856 1    6.531783 -.018074017  .3741715  -2.508516   .780392  .02444072     3.372592 0
1004  9 16222 1    6.564267  .004940137  .3553656   .3520143  .5960131          .     3.713816 0
1004  9 16587 1    6.596095  .021104025  .3153435  1.0235178 .42082205 .018312065    3.7014995 0
1004  9 16952 1    6.886347  .035923906  .3278083  12.545578  .3538446 .027473735     4.801871 0
1008  6  8917 0    .4317824   -.5746753 .05714286          .  .5316426   .7218414   -2.0402207 0
1008  6  9282 0 -.027371196  -.55806786  .1839671          . .45749855 .005388222   -1.5232602 0
1008  6  9647 0  -.03562718    -.822798 .03626943          .  .7069297   .1002469   -1.1394343 0
1009  9  7974 1   3.4354055  -.20596573  .5390736          .         .          .   -1.1647521 0
1009  9  8339 1    3.246063   -.2466036  .6598155          .  .2062948          .    -1.214023 0
1009  9  8704 1    2.089392   -.3690594  .8872524          . .54315466   .1629517   -1.4481697 0
1009  9  9070 1    2.182562   .24873154  .7358214 -34.138393  .3064029 .072737806   -1.0216513 0
1009  9  9435 1   2.3560312    .2854299  .5620438 -25.671616  .3166397  .05993561   -2.5383074 0
1009  9  9800 1    2.650421   .18368644  .6327683    20.5639  .4450549  .06218425   -1.3823023 0
1009  9 10165 1    2.853247   .06688192  .6534248   25.05226  .4495864  .04535501    -6.214608 0
1009  9 10531 1     2.78606   .09280385  .4867115   15.41256  .9953872 .024326267    -.9493306 0
1009  9 10896 1    3.261514   .07528077   .633677  22.592733  .5440766 .030183265    -3.912023 0
1009  9 11261 1     3.47615   .06602752   .648647   23.07633  .6286498 .021390636    -6.214608 0
1009  9 11626 1    3.571193  .030484546  .5955454  29.915113  .6767214  .01319791    -3.036554 0
1009  9 11992 1    3.737098   .05688965 .51465124  17.178484  .5621175  .04781688   -2.4769385 0
1009  9 12357 1   4.1588364   .05253371  .5911996   25.55353  .6286818  .02405528    -.6674795 0
1009  9 12722 1    4.541282   .05175299  .6292439   38.17668  .4745038 .012205282    -1.838851 0
1010 32  6209 1    6.587965   .04684972  .3442347          .  .2205974  .02286955     2.329519 0
1010 32  6574 1    6.670539    .0455249  .3435811          .    .19338 .009634356    2.3657477 0
1010 32  6939 1    6.830786   .04450718  .3647299          . .22305876 .015862195     2.409644 0
1010 32  7304 1    6.924412   .04705074  .3582557  11.867963  .2194916 .022876967    2.4697084 0
1010 32  7670 1    7.011654   .04033933  .3607069  12.042136  .3019638 .014513645    2.3791757 0
1010 32  8035 1    7.123317   .03830934  .3727295   10.24231 .21827044          .     2.485906 0
1010 32  8400 1      7.0902  .027559934  .3910101    5.68183 .27717042 .015707677    2.5732226 0
1010 32  8765 1    7.067373 .0015736594   .375305  1.8746475  .3048166 .009196619     2.674631 0
1010 32  9131 1    7.059311  .027731873   .748532 -2.1109502         .   .0995543    2.6055005 0
1010 32  9496 1    7.301307  .021279337  .7404664   7.290397         .  .04167838     2.710647 0
1010 32  9861 1    7.306142 -.006740879  .7721482   8.284278         .  .04845259     .7011154 0
1010 32 10226 1    7.579686   .00269661  .6050774  18.941116         .  .15194876    2.8202474 0
1010 32 10592 1    7.595028  .007575373   .623316   10.28601         .   .0235544     3.387166 0
1010 32 10957 1     7.49831   .15727852  .6786623     6.6152         .  .19623864     6.025946 0
1010 32 11322 1    7.455234  -.04354649  .6556966   -4.06354         . .018232454     3.346495 0
1010 32 11687 1    7.470943   .08937106  .6549438 -4.0517874         .  .02663246     5.003248 0
1010 32 12053 1    7.442173 -.010436848  .6046861 -1.8538333         .  .03513778     4.384436 0
1010 32 12418 1    7.418133   .06737955  .4544579 -1.2290614         . .032858614     2.492048 0
1010 32 12783 1    7.503896  .013883533  .4523167  1.1044841         . .016176423    4.0707345 0
1010 32 13148 1    7.608771   .04181962 .50932634   5.710376         .          .            . 0
1010 32 13514 1    7.704632   .04164788  .4888669  10.020818         .          .            . 0
1010 32 13879 1    8.065045  .068022504 .56775534   20.56873         .          .            . 0
1010 32 14244 1    8.088654  .020630583  .5540171   17.34649         .          .            . 0
1010 32 14609 1    8.178471  .021692766 .55483526  17.110325         .          .            . 0
1010 32 14975 1    8.241308  .021056794  .5383845   6.051458         .          .            . 0
1010 32 15340 1    8.222312   .03913406  .5118047   4.556005         .          .            . 0
1010 32 15705 1   8.2167635  .021525996 .53172183  1.2846186         .          .            . 0
1010 32 16070 1    8.483036   .07294965  .3849465    8.39114         .          .     6.477895 0
1011 39  8400 1   1.3373667   .06957207 .51509583          .         .          .   -4.2686977 0
1011 39  8765 1   1.5452193  .007037748 .21603754          .  .5762995 .027964767    -2.688248 0
1011 39  9131 1    1.771897  .011050663  .3356001          .  .6184222          .   -1.8773173 0
1011 39  9496 1   1.9516082   .01377841 .40823865  22.721474  .6657602 .004830414   -1.0613165 0
1011 39  9861 1   1.8721098   -.1045832 .20609044  11.512164  .7673731  .03094232    -.3797974 0
1011 39 10226 1   1.6164135  -.25878847 .28500497  -5.050762  .7141827  .04586212   -3.3242364 0
1011 39 10592 1   1.6058314   .06061823 .22581293  -10.88646 .56852126 .009239657    -2.918771 0
1011 39 10957 1    2.025645  -.10394407  .5140483   5.251068   .518975  .04730276   -1.7037486 0
1011 39 11322 1   2.0520704   -.2401079  .7569373   15.62927  .6399881  .02649085   -1.7660917 0
1011 39 11687 1   2.1656191  -.16605504  .7097477  20.514025   .672135  .04844726     .2021242 0
1011 39 12053 1    2.837498  -.12610555  .3663679   31.07739         .  .04555298    2.1559396 0
1011 39 12418 1   3.2180755  -.12642114   .336229   47.50153  .6990436 .021032084   -.04814038 0
1011 39 12783 1   4.2090416  -.13668787 .17080782   97.61306  .7365833 .036273457     .5816568 0
1012 40  6148 0   1.6646833   .09102952 .05809993          .         .          .   -4.2686977 0
1012 40  6513 0   1.8017098   .11485148 .06815182          .         .          .    -5.809143 0
1012 40  6878 0   2.1049874   .07980992 .15401487          .  .6825735          .   -2.8824036 0
1012 40  7243 0   2.0775647 -.065122105 .20338134   14.75476  .6335995          .    -2.703063 0
end
format %d datadate
label values incorpn incorpn
label def incorpn 6 "CA", modify
label def incorpn 9 "DE", modify
label def incorpn 32 "NJ", modify
label def incorpn 39 "PA", modify
label def incorpn 40 "RI", modify
The code I have been trying looks as follows:
Code:
asdoc ttest lnassets, by(BClaw) label stat(mean sd dif) save(sumtable.doc)
asdoc ttest roi, by(BClaw) rowappend label stat(mean sd dif) 
asdoc ttest lev, by(BClaw) rowappend label stat(mean sd dif) 
asdoc ttest cagr, by(BClaw) rowappend label stat(mean sd dif) 
asdoc ttest vol, by(BClaw) rowappend label stat(mean sd dif) 
asdoc ttest cfvol, by(BClaw) rowappend label stat(mean sd dif) 
asdoc ttest lncash, by(BClaw) rowappend label stat(mean sd dif) 
asdoc ttest delist, by(BClaw) rowappend label stat(mean sd dif)
which returns the error message:
asdoctable(): 3301 subscript invalid
<istmt>: - function returned error
r(3301);
if I simply do
Code:
asdoc ttest lnassets, by(BClaw)
it seems to work, but the output contains a lot more information than I want.
Ideally I would do the above in a loop, and if possible cluster the s.e.'s on incorpn, like so:
Code:
foreach i in lnassets roi lev cagr vol cfvol lncash delist{
asdoc ttest `i', by(BClaw) rowappend label stat(mean sd dif) 
}
But I would be happy if I could get the first version to work. Any help would be appreciated!

Thanks

Merge m:m in a panel, can't merge BY two variables

Hello,

I have tried to look for similar queries but couldn't find any. Thanks in advance for reading this.

I have two datasets: 1) a panel ordered by (Code, Year); 2) essentially a table that matches the Code in 1) to finer categories.
Example: code "10" represents CEOs and Legislators in 1) and corresponds to two categories in 2): "11-1011" (Chief executives) and "11-1031" (Legislators).

I am using the following command: merge m:m ACSCode using ONET_ACS_crosswalk

Extract of dataset 2)(ONET_ACS_crosswalk) :
Code:
* Example generated by -dataex-. To install: ssc install dataex
clear
input str7 SOC2010Code str112 NationalEmploymentMatrixSOCO int ACSCode str128 ACSOccupationalTitle
"11-1031" "Legislators"      10 "Chief executives and legislators"
"11-1011" "Chief executives" 10 "Chief executives and legislators"
end
I would like to obtain a panel still ordered by (Code, Year), but with each code "10" corresponding to both categories. I'm having trouble using merge m:m to match the two: Stata matches the first two rows with the proper categories, but thereafter assigns just one of the two categories to all remaining (Code, Year) pairs. I will try to give you an idea below with dataex, using the same example as before (Code "10" - CEOs and Legislators):
Code:
* Example generated by -dataex-. To install: ssc install dataex
clear
input int ACSCode str156 ACSTitle int year str7 SOC2010Code str112 NationalEmploymentMatrixSOCO
10 "Chief executives and legislators" 2006 "11-1031" "Legislators"     
10 "Chief executives and legislators" 2006 "11-1011" "Chief executives"
10 "Chief executives and legislators" 2006 "11-1011" "Chief executives"
10 "Chief executives and legislators" 2006 "11-1011" "Chief executives"
....
10 "Chief executives and legislators" 2007 "11-1011" "Chief executives"
10 "Chief executives and legislators" 2007 "11-1011" "Chief executives"
10 "Chief executives and legislators" 2007 "11-1011" "Chief executives"
10 "Chief executives and legislators" 2007 "11-1011" "Chief executives"
....
end
The problem is the following: from the next year (2007) onward, code "10" is only matched to "11-1011". I would need, somehow, to merge BY (ACSCode Year), so as to obtain both categories "11-1011" and "11-1031" for each (ACSCode, Year).
I hope I have been somehow clear, I am sorry if I was not. Please ask me how to clarify if needed, any criticism or suggestion is highly appreciated.

Thanks,

Francesco
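merge m:m does not form all pairwise combinations (it matches observations by their order within groups, which is rarely what is wanted); for pairing every panel row with every crosswalk row that shares a code, joinby is the usual tool. A sketch using the file name from the post:

Code:
* pair each (ACSCode, year) row with every crosswalk row for that ACSCode
joinby ACSCode using ONET_ACS_crosswalk
sort ACSCode year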

Deriving change over time by change in groups of variables

I'm running a regression to try to explain how long people spend in a hospital's A&E (ER) department. The data cover several million individual patients who were in hospital over a two-year period. The basic structure is as follows:

Code:
 reg MinsinAE i.sex#i.agegroup i.diagnosis i.had_CT i.hadMRI i.hadCT i.hadbloods hospitalbedcapacity
Over the two-year period the average time in A&E has increased, so we want to see how much each of the variables might explain that. To do this I need to combine the regression coefficients with the change in the means of the variables, to see how much of the change in predicted time in A&E can be attributed to each variable.

I've tried using margins but I'm not sure if the results I obtain are telling me what I think. Can anyone help please?

My margins command is:

Code:
margins  , over(years) at((means) _all)
I also wondered if it were possible to get the overall marginal effect of groups of variables (I have several hundred in my model). For example, the impact of all the diagnostic tests.

Many thanks
Rob

Event Study - Multiple Events in Event Window - Calculation of CAR

Dear all,

I am currently working on a research project where I am conducting an event study on the effect of a legislative process on the returns of stock-listed firms. The problem I am facing is that I have four events within my event window. On top of that, two of the events are very close to each other (within four days). t = 0 of the first event is 100 days prior to t = 0 of the fourth event. For each event I would like to look at the event window [-5;+5].

I am having trouble working out how to correctly calculate the CAR (cumulative abnormal return) for each individual firm. Do I first calculate the AR (abnormal return) within each event's [-5;+5] window, then add the ARs of the four events together and divide by four?

This is an excerpt of the times up to the four events:

Code:
 
time_event1 time_event2 time_event3 time_event4
-12 -16 -98 -112
-11 -15 -97 -111
-10 -14 -96 -110
-9 -13 -95 -109
-8 -12 -94 -108
-7 -11 -93 -107
-6 -10 -92 -106
-5 -9 -91 -105
-4 -8 -90 -104
-3 -7 -89 -103
-2 -6 -88 -102
-1 -5 -87 -101
0 -4 -86 -100
1 -3 -85 -99
2 -2 -84 -98
3 -1 -83 -97
4 0 -82 -96
5 1 -81 -95
6 2 -80 -94
7 3 -79 -93
8 4 -78 -92
9 5 -77 -91
10 6 -76 -90
11 7 -75 -89
12 8 -74 -88
13 9 -73 -87
14 10 -72 -86
Thank you very much in advance. Any help is greatly appreciated.

Regards,
Jane
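One common bookkeeping approach is to cumulate abnormal returns separately within each event's [-5;+5] window and then average across the four events; a sketch with hypothetical variables ar (daily abnormal return) and firm_id:

Code:
forvalues e = 1/4 {
    generate ar_w`e' = ar if inrange(time_event`e', -5, 5)
    egen car`e' = total(ar_w`e'), by(firm_id)   // CAR of event `e' per firm
}
generate car_avg = (car1 + car2 + car3 + car4)/4

Since events 1 and 2 are only four days apart, their windows overlap and the same daily abnormal returns enter both CARs; whether averaging is appropriate there is a design decision rather than a Stata question.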

Setting a loop for separate merges - add suffix to tempfiles?

I'm currently performing three separate merges and saving the resulting file of each merge as a temp file. After performing all of these merges, I am appending all three temp files. Here is my current code:

Code:
use `crosswalk', clear

merge 1:1 IDN using `group1'
keep if _merge==3
drop _merge

tempfile crosswalk_1
save `crosswalk_1'

use `crosswalk', clear

merge 1:1 IDN using `group2'
keep if _merge==3
drop _merge

tempfile crosswalk_2
save `crosswalk_2'

use `crosswalk', clear

merge 1:1 IDN using `group3'
keep if _merge==3
drop _merge

tempfile crosswalk_3
save `crosswalk_3'

***************************
******append files*******
***************************

use `crosswalk_1', clear
append using `crosswalk_2'
append using `crosswalk_3'
I realize that it would be more efficient to store group1-group3 in a local and then loop over them. Something like this:

Code:
use `crosswalk', clear

local files `group1' `group2' `group3' 

foreach f of local files {

    merge 1:1 IDN using `f'
    keep if _merge==3
    drop _merge

    }
What I am not sure how to do is add a suffix within the loop, so that each resulting file is saved as crosswalk_`suffix'. I thought about creating another local with the numbers 1, 2, and 3, but I do not know how to specify that each suffix be used only once, and in order. I believe that specifying this correctly would simplify my code. Advice on how to do this is appreciated. Thanks for reading.
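A forvalues loop over the numeric suffix lets each iteration pick up both the matching using-file and the tempfile name, since macros can be nested; a sketch using the locals from the post:

Code:
forvalues i = 1/3 {
    use `crosswalk', clear
    merge 1:1 IDN using `group`i''    // `group`i'' resolves to group1, group2, group3
    keep if _merge==3
    drop _merge
    tempfile crosswalk_`i'
    save `crosswalk_`i''
}
use `crosswalk_1', clear
append using `crosswalk_2'
append using `crosswalk_3'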

xtmixed within and between group variance explained

Where is the within- and between-group variance explained shown when using xtmixed? Is it displayed in the output, or obtained with an option?
I am using a 3 level model.
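xtmixed reports the variance components themselves, not the share explained; a common approach is the proportional reduction in each component relative to an empty model. A sketch for a 3-level setup with hypothetical names y, x1, lev3, lev2:

Code:
* empty model: baseline variance at each level
xtmixed y || lev3: || lev2: , variance
* full model: variances after adding predictors
xtmixed y x1 || lev3: || lev2: , variance
* "variance explained" at a level = (var_empty - var_full) / var_empty,
* read off from the random-effects tables of the two fits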

Question on interpreting results of reghdfe

Hi all

I am working on a model to identify whether there are economies of scale at the hospital department level. The dataset contains patient-level costs for all patients at all hospitals, across all types of hospital activity. The 'department' size is the total number of patients treated at the same hospital with the same activity type.

The cost of treating a patient is largely determined by the type of activity itself, so we would like to absorb activity types as fixed effects. There is also a great deal of hospital-level unobserved heterogeneity that influences costs (such as the quality of senior management), which we would also like to absorb as a fixed effect. Given the vast size of our dataset (tens of millions of observations), the large number of activity types (hundreds), the large number of hospitals (hundreds), and limited processing capacity (16GB RAM), we thought reghdfe would be a good solution.

Code:
reghdfe log_patient_cost log_department_size `patient_controls', absorb(hospital_ID activity_type_ID)
The idea is that the coefficient on log_department_size would identify any economies of scale. My question is this: if there are economies of scale at the department level, and department size is correlated (imperfectly) with hospital size, would absorbing hospital fixed effects bias our estimate of department-level economies of scale?

Unfortunately we only have 1 year of data, so cannot exploit longitudinal variation to deal with time invariant hospital heterogeneity.

Sergio Correia - I would be very appreciative if you were able to have a look at this question.

Thanks!

Is there a way to modify every value label in a data set or call the value label of a variable?

I am working with data where the value -99 always means "Refused Question"; it is currently unlabeled. Is there a way to retrieve the value label attached to a variable?

I would like to run a loop something like this:

Code:
foreach var of varlist _all {
      label define <<VALUE LABEL OF `var'>> -99 "Refused Question", modify
}
Where "<<VALUE LABEL OF `var'>>" calls the value label of the variable.

Thanks!
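Stata's extended macro functions can retrieve the name of the value label attached to a variable, which fills in the <<VALUE LABEL OF `var'>> placeholder; a sketch that skips variables with no label attached:

Code:
foreach var of varlist _all {
    local lbl : value label `var'              // label name, empty if none
    if "`lbl'" != "" {
        label define `lbl' -99 "Refused Question", modify
    }
}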

Multinomial Logit? Multinomial Problem or not?

Hi everyone,

I need your help, because I don't know how to approach this problem.

They ask me: which is the best parking lot? Or which is the best block of the parking lot? (xlsx attachment)

Can I use a multinomial logit? I have the scheme shown in the photo attachment.

I tried mlogit street (indvars) and then the other dependent variables, but it fails.

tab streets works fine, but I don't know why the multinomial does not.

Please help!

PS: I'm a newbie at writing in English.

Plotting percentage based on other variable

I would like to generate a line plot depicting the percentage of firms that have mcap < 100 in each year. Then I want to add another line plot depicting the average mcap of those firms.
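One route is to collapse to one observation per year, computing both series, then overlay two line plots with a second axis; a sketch, assuming the year and mcap variables from the data example in this post:

Code:
generate small = mcap < 100 if !missing(mcap)   // 1 if mcap below 100, else 0
generate mcap_small = mcap if small == 1        // mcap only for those firms
collapse (mean) pct_small=small mcap_small, by(year)
replace pct_small = 100*pct_small               // share -> percentage
twoway (line pct_small year) ///
       (line mcap_small year, yaxis(2)), ///
       legend(order(1 "% of firms with mcap < 100" 2 "avg mcap of those firms"))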


Code:
* Example generated by -dataex-. To install: ssc install dataex
clear
input int year str105 cname double mcap
1990 "COMPANHIA INDUSTRIAL DE RESINAS SINTETICAS, CIRES, S.A."      14.5568724699
1990 "JERONIMO MARTINS SGPS SA"                                   110.01034468728
1990 "SDC INVESTIMENTOS SA"                                        69.83170568471
1990 "SOMAGUE - SOCIEDADE GESTORA PARTICIPACOES SOCIAIS, S.A."      11.1730729776
1990 "PAPELARIA FERNANDES-INDUS. E COMERCIO,SA"                29.927873824080002
1990 "IMOBILIARIA CONSTRUTORA GRAO - PARA, SA"                     25.43625063597
1990 "BANCO CHEMICAL PORTUGAL SA"                                 148.33002464062
1990 "CEREXPORT-CERAMICA DE EXPORTACAO, S.A."                      34.74626150976
1990 "SUMOL COMPAL SA"                                         13.811184146190001
1990 "EFACEC CAPITAL SGPS SA"                                  50.503287008330005
1990 "MAGUE SGPS S.A."                                             70.04618868527
1990 "TOYOTA CAETANO PORTUGAL SA"                                  64.84372671861
1990 "TERTIR TERMINAIS DE PORTUGAL, SA"                             68.5435387913
1990 "MUNIDCENTER SGPS SA"                                         34.34223521314
1990 "INVESTIMENTOS, PARTICIPACOES E GESTAO, S.A."                 72.12617595738
1990 "LUSOTUR-SOCIEDADE FINANC. DE TURISMO"                        98.93032790934
1990 "CIN - CORPORACAO INDUSTRIAL DO NORTE, S.A."              26.935086391800002
1990 "BANCO BPI, S.A."                                            339.47977700276
1990 "ATLANTIS - CRISTAIS DE ALCOBACA SA"                          23.03678135693
1990 "GESTNAVE PRESTACAO DE SERVICOS INDUSTRI"                    138.54111591066
1990 "BES - INVESTIMENTO S.A."                                     92.27761102929
1990 "SOJA DE PORTUGAL SOC. GESTORA DE PART.SA"                    30.22715256232
1990 "CELULOSE DO CAIMA SGPS SA"                                    91.8845582147
1990 "CIA DE SEGUROS TRANQUILIDADE-VIDA, S.A."                    199.51915902675
1990 "BICC CELCAT, CABOS DE TELECOMUNICACOES"                  13.130854640319999
1990 "ESTORIL - SOL, S.A."                                         37.78394070291
1990 "UNICER - UNIAO CERVEJEIRA SA"                               175.07806187089
1990 "COMPANHIA PORTUGUESA DO COBRE SGPS, SA"                   8.376770133980001
1990 "FABRICAS TRIUNFO, S.A."                                      17.92595345218
1990 "BANCO TOTTA & ACORES, S.A."                              493.81281598303997
1990 "DOM PEDRO INVESTIMENTO TURISTICOS S.A."                      15.06369654134
1991 "FABRICAS TRIUNFO, S.A."                                      17.92595345218
1991 "TOYOTA CAETANO PORTUGAL SA"                                  54.86776862761
1991 "CEREXPORT-CERAMICA DE EXPORTACAO, S.A."                  25.737971488710002
1991 "CELULOSE DO CAIMA SGPS SA"                                   50.44642411788
1991 "EFACEC CAPITAL SGPS SA"                                      66.96361755715
1991 "COMPANHIA PORTUGUESA DO COBRE SGPS, SA"                       7.11319003701
1991 "BANCO BPI, S.A."                                         307.75054194766005
1991 "MUNIDCENTER SGPS SA"                                     37.065627836910004
1991 "SOJA DE PORTUGAL SOC. GESTORA DE PART.SA"                    28.43896210134
1991 "SONAE-SGSP SA"                                               105.3730900131
1991 "DOM PEDRO INVESTIMENTO TURISTICOS S.A."                      10.57451541784
1991 "CIN - CORPORACAO INDUSTRIAL DO NORTE, S.A."                  30.12739248982
1991 "ESTORIL - SOL, S.A."                                         29.47147352136
1991 "IMOBILIARIA CONSTRUTORA GRAO - PARA, SA"                     26.18437565467
1991 "FISIPE-FIBRAS SINTETICAS DE PORTUGAL SA"                 16.318789317740002
1991 "ATLANTIS - CRISTAIS DE ALCOBACA SA"                      14.884154188410001
1991 "CIA DE SEGUROS TRANQUILIDADE-VIDA, S.A."                          167.09721
1991 "MAGUE SGPS S.A."                                             79.02575984378
1991 "BANCO CHEMICAL PORTUGAL SA"                                 114.28706816572
1991 "TERTIR TERMINAIS DE PORTUGAL, SA"                             41.4935006634
1991 "UNICER - UNIAO CERVEJEIRA SA"                                    158.542995
1991 "SDC INVESTIMENTOS SA"                                        76.81487594828
1991 "LUSOTUR-SOCIEDADE FINANC. DE TURISMO"                        68.69906038719
1991 "BES - INVESTIMENTO S.A."                                     65.34252453805
1991 "BANCO PORTUGUES DO ATLANTICO SA"                            743.70766510154
1991 "INVESTIMENTOS, PARTICIPACOES E GESTAO, S.A."                   56.862960322
1991 "PAPELARIA FERNANDES-INDUS. E COMERCIO,SA"                    28.18872467354
1991 "BICC CELCAT, CABOS DE TELECOMUNICACOES"                      38.96609171896
1991 "COMPANHIA INDUSTRIAL DE RESINAS SINTETICAS, CIRES, S.A."     21.20090581698
1991 "BANCO TOTTA & ACORES, S.A."                                 781.11783615551
1991 "SUMOL COMPAL SA"                                             14.34925353125
1991 "SOMAGUE - SOCIEDADE GESTORA PARTICIPACOES SOCIAIS, S.A."     10.17547710307
1991 "JERONIMO MARTINS SGPS SA"                                   149.04081164394
1991 "GESTNAVE PRESTACAO DE SERVICOS INDUSTRI"                    103.74996259016
1992 "SACOR MARITIMA, SA"                                      29.613631148929997
1992 "CIA DE SEGUROS TRANQUILIDADE-VIDA, S.A."                    149.63936927006
1992 "CELULOSE DO CAIMA SGPS SA"                               24.394364399999997
1992 "EFACEC CAPITAL SGPS SA"                                      82.95569787546
1992 "JERONIMO MARTINS SGPS SA"                                158.61773126763998
1992 "SUMOL COMPAL SA"                                             15.98638132102
1992 "IMOBILIARIA CONSTRUTORA GRAO - PARA, SA"                     13.71562534292
1992 "PAPELARIA FERNANDES-INDUS. E COMERCIO,SA"                7.9421095160699995
1992 "SDC INVESTIMENTOS SA"                                     83.79804664742001
1992 "SOMAGUE - SOCIEDADE GESTORA PARTICIPACOES SOCIAIS, S.A."     16.46033068628
1992 "SOJA DE PORTUGAL SOC. GESTORA DE PART.SA"                     35.1652517433
1992 "BANCO PORTUGUES DO ATLANTICO SA"                          938.4780303708101
1992 "COMPANHIA INDUSTRIAL DE RESINAS SINTETICAS, CIRES, S.A."     19.35734878942
1992 "BES - INVESTIMENTO S.A."                                     49.87978960237
1992 "SONAE-SGSP SA"                                              116.96810738395
1992 "ESTORIL - SOL, S.A."                                         20.15143504155
1992 "BANCO BPI, S.A."                                            380.08681356302
1992 "DOM PEDRO INVESTIMENTO TURISTICOS S.A."                   8.180285511920001
1992 "TERTIR TERMINAIS DE PORTUGAL, SA"                            32.84883430931
1992 "MUNIDCENTER SGPS SA"                                     35.681796071469996
1992 "LUSOTUR-SOCIEDADE FINANC. DE TURISMO"                        46.01410600435
1992 "CEREXPORT-CERAMICA DE EXPORTACAO, S.A."                       8.62222044872
1992 "CIN - CORPORACAO INDUSTRIAL DO NORTE, S.A."                  35.61417029174
1992 "TOYOTA CAETANO PORTUGAL SA"                                  71.07870048184
1992 "FISIPE-FIBRAS SINTETICAS DE PORTUGAL SA"                     18.42361907802
1992 "MAGUE SGPS S.A."                                          91.28600073822001
1992 "BICC CELCAT, CABOS DE TELECOMUNICACOES"                           43.993962
1992 "GESTNAVE PRESTACAO DE SERVICOS INDUSTRI"                     40.86152372782
1992 "BANCO TOTTA & ACORES, S.A."                                 698.31705479544
1992 "INVESTIMENTOS, PARTICIPACOES E GESTAO, S.A."             31.394339573369997
1992 "BANCO CHEMICAL PORTUGAL SA"                                  96.34517861953
1992 "COMPANHIA PORTUGUESA DO COBRE SGPS, SA"                       1.68992727527
1992 "ATLANTIS - CRISTAIS DE ALCOBACA SA"                            6.5375943975
1992 "UNICER - UNIAO CERVEJEIRA SA"                               186.74993266228
end
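A minimal sketch of one way to build both series, assuming one observation per firm-year as in the -dataex- above (the new variable names -small-, -pct_small-, and -mcap_small- are made up):

Code:
preserve
gen byte small = mcap < 100
gen mcap_small = mcap if small            // mcap only for firms below 100
collapse (mean) pct_small=small mcap_small, by(year)
replace pct_small = 100 * pct_small       // share -> percentage
twoway (line pct_small year) ///
       (line mcap_small year, yaxis(2)), ///
    ytitle("Percent of firms with mcap < 100") ///
    ytitle("Average mcap of those firms", axis(2))
restore

The mean of the 0/1 indicator gives the share below 100, while -mcap_small- is missing for large firms, so its mean within year is the average over small firms only.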

Reshape data set

Dear all,

How could I transfer the data below from layout (1):

[image attachment]

to layout (2)?

[image attachment]

Here is the original data set (1)
Code:
* Example generated by -dataex-. To install: ssc install dataex
clear
input float(EAdults_Uninsured_pct_15 EAfrican_American_pct_15 EAmr_Ind_Alas_Nati_pct_15 EAsian_pct_15 EAverageInfantDeaths_15 EAverage_Daily_PM25_15)
3675.481 1005393 2175848.3 1325985 645519.5 685926.1
end
Best regards,
Jack
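Since layout (2) was only shown in an image attachment, the following is a guess: if the goal is to turn the single wide row of E*-prefixed measures into one observation per measure, a -reshape long- sketch could look like this (the -id- and -measure- names are made up):

Code:
* sketch: every variable shares the stub "E", so j() captures the rest
* of each name as a string
gen id = _n
reshape long E, i(id) j(measure) string
list measure E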

Plotting Percentage Shares over Continuous Variable

Dear Statalisters,

I have a continuous variable X (ranging from 0 to 300) and a set of binary variables (say, Y, Z, and W). My goal is to plot three curves on the same graph, each representing the share of observations for which one of the binary variables equals 1, as a function of the continuous variable.

I am focusing on plotting just one binary variable (say, Y) for now.

I think it is along the line as this:

Code:
graph bar (mean) Y, over(X)
Except that I would like a smooth (kernel-like) curve with optimized or pre-determined bin widths.

I have tried the -kdensity- command, but it does not produce what I want: it estimates the density of a variable, not a conditional share. Any help or direction to other posts will be much appreciated.

Thanks for your attention.
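One hedged suggestion: a local polynomial smooth of a 0/1 variable against X estimates the probability (share) that it equals 1 at each value of X, which may be exactly the kernel-like curve wanted here. A sketch:

Code:
* lpoly of a binary outcome is a kernel-smoothed conditional share;
* the bandwidth is chosen automatically unless bwidth() is specified
twoway (lpoly Y X) (lpoly Z X) (lpoly W X), ///
    ytitle("Share equal to 1") xtitle("X") ///
    legend(order(1 "Y" 2 "Z" 3 "W"))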


