multiple-regression questions

3,316 multiple-regression questions.

I am new to time-series analysis. I have been struggling with a regression problem on time-series data, and I have tried to find a good (correct) tutorial or blog to guide me through model ...

My plots look like the attached figure; please suggest what I should do next. My dataset has Prestige, income, education, and suicide rates for 36 occupations. The suicide rate is the response ...

If I have a true model $y = \beta_0 + \beta_1 x_1 + \beta_2 x_2 + \beta_3 x_3 + \epsilon$ but $x_3$ is unobservable, what are the consequences of having an unobservable variable which correlates ...
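The consequence asked about here is omitted-variable bias, and it is easy to see in a simulation. A minimal numpy sketch (all coefficients and sample sizes invented for illustration): the omitted $x_3$'s effect loads onto the regressor it correlates with, while the uncorrelated regressor is unaffected.

```python
import numpy as np

# Simulation sketch of omitted-variable bias (all numbers are made up).
rng = np.random.default_rng(8)
n = 100_000
x3 = rng.normal(size=n)                 # the unobservable variable
x1 = 0.8 * x3 + rng.normal(size=n)      # x1 is correlated with x3
x2 = rng.normal(size=n)                 # x2 is independent of x3
y = 1 + 2 * x1 + 3 * x2 + 1.5 * x3 + rng.normal(size=n)

# Fit the model without x3: its effect loads onto the correlated regressor.
X = np.column_stack([np.ones(n), x1, x2])
b = np.linalg.lstsq(X, y, rcond=None)[0]
# b[1] is biased upward from 2 (omitted-variable bias); b[2] stays near 3.
```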

I have three levels (say A, B, and C) in my experiment and wish to do a within-subject regression analysis. All 3 levels are manipulated within-subject. Also, A is outcome category, B is outcome ...

Can this formula be used for calculating the standard error of the regression coefficients? $$ SE(\hat{\beta}_{j})^{2} = \frac{1}{1 - R_{j}^{2}} \cdot \frac{\sigma^{2}}{\sum_{i=1}^{n}(x_{ij} - \overline{x}_{j})^{2}} $$ ...
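With an intercept in the model, this is an exact algebraic identity: the factor $1/(1-R_j^2)$ is the variance inflation factor, where $R_j^2$ comes from regressing $x_j$ on the other predictors. A minimal numpy check on synthetic data (all numbers invented) compares the VIF-form standard error with the usual diagonal of $\sigma^2 (X'X)^{-1}$:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
x1 = rng.normal(size=n)
x2 = 0.5 * x1 + rng.normal(size=n)      # deliberately correlated predictors
y = 1.0 + 2.0 * x1 - 1.5 * x2 + rng.normal(size=n)

X = np.column_stack([np.ones(n), x1, x2])
beta = np.linalg.lstsq(X, y, rcond=None)[0]
resid = y - X @ beta
sigma2 = resid @ resid / (n - X.shape[1])      # unbiased error variance

# SE via the covariance matrix sigma^2 (X'X)^{-1}
cov = sigma2 * np.linalg.inv(X.T @ X)
se_direct = np.sqrt(np.diag(cov))

# SE for x1 via the VIF formula: regress x1 on the other predictors
Z = np.column_stack([np.ones(n), x2])
g = np.linalg.lstsq(Z, x1, rcond=None)[0]
r = x1 - Z @ g
R2_j = 1 - (r @ r) / np.sum((x1 - x1.mean()) ** 2)
se_vif = np.sqrt(sigma2 / ((1 - R2_j) * np.sum((x1 - x1.mean()) ** 2)))
# se_vif matches se_direct[1] (the entry for x1) exactly.
```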

I want to predict a function of several variables, f(a,b,c), which is currently modeled as a linear combination of a, b, c, etc.: f = theta1*a + theta2*b + ... Its inputs a, b, and c may be predicted from d, e ...

I encountered a lot of references that talk about R squared, but I can't understand the difference between the R squared in the regression of the response on the predictors and the R squared that ...

I want to ask a question regarding collinearity and the Variance Inflation Factor (VIF). I started with 7 variables, and have excluded one since there was no correlation between that predictor variable ...

I don't quite understand how my lecturer derives this. In linear regression we have $y = \hat{y} + e$, where the variables above are $n \times 1$ vectors. Then he writes: $y^Ty = \hat{y}^T\hat{y} + ...
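The key step is that the OLS normal equations force the residual vector to be orthogonal to the fitted values, so the cross term vanishes when expanding $(\hat{y}+e)^T(\hat{y}+e)$. A quick numeric check on synthetic data:

```python
import numpy as np

# Numeric check of the decomposition y'y = yhat'yhat + e'e (synthetic data).
rng = np.random.default_rng(1)
n = 50
X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])
y = X @ np.array([1.0, 2.0, -1.0]) + rng.normal(size=n)

beta = np.linalg.lstsq(X, y, rcond=None)[0]
yhat = X @ beta
e = y - yhat

# OLS normal equations give X'e = 0, hence yhat'e = (X beta)'e = 0,
# so y'y = (yhat + e)'(yhat + e) = yhat'yhat + e'e.
```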

I'm looking into a possible topic for a school project currently. It involves looking at the S&P 500 in comparison to other indices globally (e.g., Nikkei, DAX, etc.). I currently have plotted 19 ...

I'm working through Introduction to Statistical Learning and in chapter 3 (Linear Regression), I learn that if a relationship between predictors and a response is not linear, I may use cubic ...

I want to recreate a regression model based on what was given in a scientific paper. They gave intercept and coefficient terms. I know how to create regression models in R, but is it possible to ...
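No refitting is needed for this: a fitted linear model is just its intercept plus coefficient-weighted inputs, so the published numbers can be applied directly. A tiny sketch with hypothetical coefficients (the names and values below are stand-ins, not from any real paper); the same one-liner works in R:

```python
# Hypothetical published coefficients (intercept and slopes) -- stand-ins,
# not taken from any real paper.
coefs = {"intercept": 2.5, "age": 0.8, "income": -0.3}

def predict(age, income):
    """Apply the published linear model directly; no refitting required."""
    return coefs["intercept"] + coefs["age"] * age + coefs["income"] * income
```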

Let's say I have 3 factors f1, f2 and f3 and fit these models ...

The research is about understanding the importance of 3 factors on personal ratings for some restaurants. There are 3 independent variables (Food Quality, Services, Environment) and 1 dependent ...

I hope this is not a stupid question. Let us say I have a data generation process that is quite stationary and I do not care about arriving at generalizable knowledge but more about accurate ...

I have the following type of data matrix. I want to find the significance of the predictor variables (including combined effects), as in $y=\beta_0+\beta_1 x_1+\beta_2 x_2+\beta_3 x_3+\beta_4 x_1^2+\beta_5 x_2^2+\beta_6 x_3^2+...
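Squared (and, in the same way, interaction) terms are just extra columns in the design matrix, and per-term significance comes from the usual t-statistics. A numpy sketch on synthetic data (coefficients invented; only the $x_1$ terms are truly nonzero here); cross terms like `x1*x2` would be added as additional columns in exactly the same way:

```python
import numpy as np

# Sketch: fit linear + squared terms and inspect t-statistics (synthetic data).
rng = np.random.default_rng(2)
n = 300
x1, x2, x3 = rng.normal(size=(3, n))
y = 1 + 0.5 * x1 + 2.0 * x1**2 + rng.normal(size=n)   # only x1 terms matter

X = np.column_stack([np.ones(n), x1, x2, x3, x1**2, x2**2, x3**2])
beta = np.linalg.lstsq(X, y, rcond=None)[0]
resid = y - X @ beta
sigma2 = resid @ resid / (n - X.shape[1])
se = np.sqrt(np.diag(sigma2 * np.linalg.inv(X.T @ X)))
t_stats = beta / se        # |t| well above ~2 flags a term as significant
```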

While there are questions regarding regressing out (or partialling out) a predictive variable, I want to regress out the dependent variable. I hope the question makes sense. Please let me know if this ...

This might be a trivial and vague question, but I still don't understand why, when creating test statistics or estimators, we always divide by the degrees of freedom. Just to give examples of what I'm ...

I have limited statistical experience from my coursework in undergrad running simple linear regressions and performing chi-square tests. I have some data, ~5000 survey results on individuals, each ...

When calculating a panel data regression with multiple fixed effects using the felm() (of the lfe package), no constant / ...

This question is related to "How to analyze curvilinear seasonal data". I have data like the following: ...

I am using the caret package to do L2-regularized multiple linear regression modeling. I am able to train the model and get the best tuned hyperparameter (lambda). The main goal for me, however, is to ...
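For intuition about what the tuned lambda does, ridge regression has a closed form. A numpy sketch on synthetic data, with `lam` standing in for caret's selected hyperparameter (all numbers invented): for any positive lambda, the ridge coefficient vector is shrunk toward zero relative to OLS.

```python
import numpy as np

# Sketch of L2 (ridge) regression in closed form, with a stand-in lambda
# playing the role of caret's tuned hyperparameter (synthetic data).
rng = np.random.default_rng(7)
n, p = 100, 5
X = rng.normal(size=(n, p))
y = X @ np.array([1.0, 0.5, 0.0, -1.0, 0.2]) + rng.normal(size=n)

lam = 1.0                      # stand-in for the tuned lambda
Xc = X - X.mean(axis=0)        # center so the intercept is handled separately
yc = y - y.mean()
beta_ridge = np.linalg.solve(Xc.T @ Xc + lam * np.eye(p), Xc.T @ yc)
beta_ols = np.linalg.lstsq(Xc, yc, rcond=None)[0]
# For lam > 0, ridge shrinks the coefficient vector toward zero.
```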

I am working on the Boston dataset, in which the aim is to predict MEDV, the median value of owner-occupied homes in $1000s. ...

I know there are already two questions on this topic, but neither has an answer. I have a set of $N$ experiments. For each experiment, denoted by a vector of predictors $\mathbf{x}$, I measure $m$ ...

I have a weekly sales data for a product which I have collected over past 16 years. Data is highly seasonal, cycles repeat themselves every 52 weeks. I am using python to build a forecasting model. ...
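One common regression-based approach to a fixed 52-week cycle is to add Fourier (sine/cosine) terms alongside a trend. A Python sketch on a synthetic series standing in for the real sales data (trend and amplitude values invented):

```python
import numpy as np

# Sketch: seasonal regression with Fourier terms for a 52-week cycle
# (synthetic series standing in for the real weekly sales data).
rng = np.random.default_rng(9)
t = np.arange(16 * 52)                                 # 16 years of weeks
y = 100 + 0.05 * t + 20 * np.sin(2 * np.pi * t / 52) \
    + rng.normal(scale=5, size=t.size)

X = np.column_stack([
    np.ones(t.size),
    t,                                # linear trend
    np.sin(2 * np.pi * t / 52),       # annual seasonality, first harmonic
    np.cos(2 * np.pi * t / 52),
])
beta = np.linalg.lstsq(X, y, rcond=None)[0]
fitted = X @ beta                     # trend + seasonal fit
```

Higher harmonics (`sin(4*pi*t/52)`, etc.) can be appended as further columns if the seasonal shape is not a pure sinusoid.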

I have a dataset with stratified sampling and there is a variable called weight for each observation. I think it's the sampling weight and can I use linear model for this dataset instead go for a ...

Good morning. I have a set of data $(\sigma,D,\alpha_0)_i$, $i=1\ldots n$. I want to determine two parameters $K_{IC}$, $C_f$ in the basic equation given as $K_{IC} = \sigma \sqrt{D} k_0(\...

From what I understand, variable selection based on p-values (at least in regression context) is highly flawed. It appears variable selection based on AIC (or similar) is also considered flawed by ...

I have a null result for a negative binomial regression model and I would like to give evidence that my sample is large enough to detect even small effects. I've found a few online calculators for ...

On this page, it is mentioned that linear regression requires residuals to be normally distributed. Why do we need this assumption, and what will happen if it is not satisfied?
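A short simulation illustrates the usual answer: the coefficient estimates themselves do not need normal errors (mean-zero errors suffice for unbiasedness), but non-normal residuals undermine small-sample t/F inference and normal-theory prediction intervals. Sketch with deliberately skewed (exponential) errors, all numbers invented:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 5000
x = rng.normal(size=n)
eps = rng.exponential(1.0, size=n) - 1.0   # mean-zero but heavily skewed
y = 1.0 + 2.0 * x + eps

X = np.column_stack([np.ones(n), x])
beta = np.linalg.lstsq(X, y, rcond=None)[0]
resid = y - X @ beta

# The slope is still estimated well without normal errors, but the residuals
# are clearly skewed, so small-sample t/F tests and normal-theory prediction
# intervals are suspect.
skew = np.mean(resid**3) / np.mean(resid**2) ** 1.5
```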

I have an F-statistic, $F(4,10)$, my constant, and 4 coefficients $\beta_2, \beta_3, \beta_4$ and $\beta_5$. I already know that (in this case) the 10 reflects the number of observations. But what ...

I have my OLS regression: $y = \beta_1 + \beta_2 X_2 + \beta_3 X_3 + \beta_4 (X_3)^2$. Could anybody please explain the effect of a change in $X_3$ on the dependent variable? (Is the effect ...
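With a quadratic term, the marginal effect is not constant: differentiating gives $\partial y / \partial X_3 = \beta_3 + 2\beta_4 X_3$, so the effect depends on the level of $X_3$. A tiny sketch with made-up coefficient values:

```python
# Hypothetical fitted coefficients for y = b1 + b2*X2 + b3*X3 + b4*X3^2
# (stand-in numbers for illustration).
b3, b4 = 1.2, -0.4

def marginal_effect_x3(x3):
    """dy/dX3 = b3 + 2*b4*X3: the effect of X3 varies with the level of X3."""
    return b3 + 2 * b4 * x3
```

With these stand-in values the effect is positive at $X_3 = 0$, shrinks as $X_3$ grows, and crosses zero at $X_3 = -b_3/(2 b_4) = 1.5$.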

I have a data set with 5 independent variables. Is it possible to do a regression analysis without the presence of a dependent variable? ...

My research is about password safety. As the dependent variable I have the number of attempts a piece of software needed to crack the respective passwords. This results in values in a range from approx. 1e+5 to 1e+...
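A response spanning many orders of magnitude like this is usually modeled on a log scale, which compresses the range and tames the skew. A minimal sketch with hypothetical attempt counts (the values below are invented):

```python
import numpy as np

# Hypothetical crack-attempt counts spanning many orders of magnitude.
attempts = np.array([3.2e5, 1.1e7, 4.5e9, 2.0e12])

# Regressing log10(attempts) rather than raw attempts compresses the range
# and typically makes a linear model far better behaved for such data.
log_attempts = np.log10(attempts)
```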

I have a dataset of auto thefts that has the date, day, and time the thefts occurred. My independent variables would be day of the week, month, hour of the day, etc. I want to see if auto theft is ...

I understand that for simple linear regression, the sample correlation coefficient is the square root of the $R^2$. But that's just for a simple (i.e., single variable) regression $Y=\beta_0+\beta_1X+\...
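Both identities are easy to verify numerically: in simple regression $R^2$ equals the squared correlation of $y$ with the single predictor, and in multiple regression $R^2$ equals the squared correlation of $y$ with the fitted values $\hat{y}$ (the "multiple correlation"). A numpy check on synthetic data:

```python
import numpy as np

rng = np.random.default_rng(4)
n = 100
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
y = 1 + 2 * x1 - x2 + rng.normal(size=n)

def fit_r2(y, *cols):
    """OLS with intercept; returns fitted values and R^2."""
    X = np.column_stack([np.ones(len(y))] + list(cols))
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    yhat = X @ beta
    r2 = 1 - np.sum((y - yhat) ** 2) / np.sum((y - y.mean()) ** 2)
    return yhat, r2

# Simple regression: R^2 equals the squared correlation of y with x1.
_, r2_simple = fit_r2(y, x1)
r_xy = np.corrcoef(x1, y)[0, 1]

# Multiple regression: R^2 equals the squared correlation of y with yhat.
yhat, r2_mult = fit_r2(y, x1, x2)
r_yyhat = np.corrcoef(yhat, y)[0, 1]
```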

Suppose there are three time series, $X_1$, $X_2$ and $Y$. Running an ordinary linear regression of $Y$ ~ $X_1$ ($Y = b X_1 + b_0 + \epsilon$), we get $R^2 = U$. The ordinary linear regression $Y$ ~ $X_2$ gets ...

I want to run a linear regression in SPSS. N = 1400. Outcome variable = rating from 0 to 800 (participants saw or heard a Mandarin speaker and had to rate how pleasant the speaker was feeling). ...

A very basic question concerning the $R^2$ of OLS regressions: run the OLS regression y ~ x1 and we have an $R^2$, say 0.3; run the OLS regression y ~ x2 and we have another $R^2$, say 0.4; now we run a regression y ~...
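The individual $R^2$ values do not simply add up: when the predictors are correlated they share explained variance, and the only guarantee is that the joint $R^2$ is at least the larger single-predictor $R^2$. A numpy demonstration on synthetic data with deliberately correlated predictors:

```python
import numpy as np

rng = np.random.default_rng(5)
n = 500
x1 = rng.normal(size=n)
x2 = 0.7 * x1 + rng.normal(size=n)     # x1 and x2 are correlated
y = x1 + x2 + rng.normal(size=n)

def r2(y, *cols):
    """R^2 of an OLS fit with intercept on the given predictor columns."""
    X = np.column_stack([np.ones(len(y))] + list(cols))
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    resid = y - X @ beta
    return 1 - resid @ resid / np.sum((y - y.mean()) ** 2)

r2_1, r2_2, r2_12 = r2(y, x1), r2(y, x2), r2(y, x1, x2)
# With correlated predictors the joint R^2 is generally NOT r2_1 + r2_2;
# the only guarantee is r2_12 >= max(r2_1, r2_2).
```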

I am performing the multiple linear regression below in R to predict returns on funds managed. reg <- lm(formula=RET~GRI+SAT+MBA+AGE+TEN, data=rawdata). Here ...

I used multiple imputation on SPSS to deal with missing data in my study. I then carried out multiple regression from the imputed and original data-sets, using a split-file. I now have output for each ...

Say, in the case of a standard CLRM (classical linear regression model), we are aware that the population estimators $\beta_1$, $\beta_2$, etc. satisfy the relation $f(\beta_1$,$\beta_2$,$...

I am supposed to run a regression of medical expenditure on several explanatory variables like income, number of illnesses, and age, and also some dummy variables like a gender-differentiating dummy, a dummy for ...

I did the following steps in my modeling using R: 1) applied the preProcess(data, method = c("bagImpute")) function in the caret package and then encoded the data; 2) used SMOTE to balance the data (because the ...

I have to calculate a multiple regression via different methods (without using software): regress X1 on X2 and then Y on the error, and regress X2 on X1 and then Y on the error, using matrix ...

I am trying to prove that in multivariate linear regression $MSE = (n-2)\sigma^2$. Here is my approach: under the usual notation, $$ Y = X\beta + \epsilon $$ $$ \hat Y = X\hat\beta $$ $$ ...
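One thing worth noting: the $(n-2)$ factor is the simple-regression special case ($p = 2$ columns, intercept plus one slope); in general $E[SSE] = (n-p)\sigma^2$, which is why $SSE/(n-p)$ is the unbiased estimator of $\sigma^2$. A Monte Carlo sketch checking this on synthetic data (design and parameters invented):

```python
import numpy as np

# Monte Carlo check that E[SSE] = (n - p) * sigma^2, so SSE/(n - p) is an
# unbiased estimator of sigma^2 (p counts the intercept column).
rng = np.random.default_rng(6)
n, p, sigma2 = 40, 3, 4.0
X = np.column_stack([np.ones(n), rng.normal(size=(n, p - 1))])
beta_true = np.array([1.0, 2.0, -1.0])

sse = []
for _ in range(2000):
    y = X @ beta_true + rng.normal(scale=np.sqrt(sigma2), size=n)
    b = np.linalg.lstsq(X, y, rcond=None)[0]
    r = y - X @ b
    sse.append(r @ r)

mse_hat = np.mean(sse) / (n - p)   # should be close to sigma2 = 4.0
```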

This is how my training dataset looks. I want to predict the revenue of the restaurant. I have been told to use logistic regression to predict the revenue; how can I achieve this using ...

I have a dataset with a large number of observations. My question is regarding the combining of linear regression and another smoothing technique. For example, if I have thousands of observations, ...

I want to make sure that I can generally interpret model findings accurately. Is it fair to say that each log-odds associated with a predictor assumes that the others are held constant at 0? Making it ...
