
Research & Publications Office to host seminar on ‘Near Optimal Heteroscedastic Regression with Symbiotic Learning’ on 5th September

The talk will be delivered by Dr. Praneeth Netrapalli, Research Scientist at Google Research India

22 August, 2023, Bengaluru: The Office of Research and Publications (R&P) at IIM Bangalore will host a research seminar on ‘Near Optimal Heteroscedastic Regression with Symbiotic Learning’, to be led by Dr. Praneeth Netrapalli, Research Scientist at Google Research India (Decision Sciences area), at 3.30 pm on 5th September 2023, at Classroom M-21. 

Abstract: The research deals with the classical problem of heteroscedastic linear regression where, given $n$ independent and identically distributed samples $(x_i, y_i)$ drawn from the model $y_i = w^T x_i + \epsilon_i \cdot (f^T x_i)$, with $x_i \sim N(0, I)$ and $\epsilon_i \sim N(0, 1)$, the aim is to estimate the regressor `w` without prior knowledge of the noise parameter `f`. In addition to classical applications of such models in statistics (Jobson and Fuller, 1980), econometrics (Harvey, 1976) and time series analysis (Engle, 1982), the problem is particularly relevant in machine learning settings where data is collected from multiple sources of varying (but a priori unknown) quality, for example, in the training of large models (Devlin et al., 2019) on web-scale data. In this work, the researchers develop an algorithm called SymbLearn (short for Symbiotic Learning) which estimates `w` in squared norm up to an error of $\tilde{O}(\|f\|^2 (1/n + (d/n)^2))$, and prove that this rate is minimax optimal modulo logarithmic factors. This represents a substantial improvement upon the previous best known upper bound of $\tilde{O}(\|f\|^2 d/n)$. The algorithm is essentially an alternating minimization procedure comprising two key subroutines: (1) an adaptation of the classical weighted least squares heuristic to estimate `w` (dating back to at least Davidian and Carroll, 1987), for which the work presents the first non-asymptotic guarantee; and (2) a novel non-convex pseudo-gradient descent procedure for estimating `f`, which draws inspiration from the phase retrieval literature. As corollaries of the analysis, the researchers obtain fast non-asymptotic rates for two important problems, linear regression with multiplicative noise and phase retrieval with multiplicative noise, both of which could be of independent interest.
Beyond this, the proof of the lower bound, which involves a novel adaptation of Le Cam’s two-point method for handling infinite mutual information quantities (which prevent a direct application of standard techniques such as Fano’s method), could also be of broader interest for establishing lower bounds for other heteroscedastic or heavy-tailed statistical problems.
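To make the alternating-minimization idea in the abstract concrete, the following is a minimal toy sketch, not the authors' SymbLearn implementation: it simulates the model $y_i = w^T x_i + \epsilon_i (f^T x_i)$ and alternates between (1) weighted least squares for `w` and (2) a simple spectral estimate of `f` from squared residuals (standing in for the paper's pseudo-gradient procedure; the spectral step uses the Gaussian identity $E[(f^T x)^2 x x^T] = \|f\|^2 I + 2 f f^T$). All function names and parameter choices here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(n, d, w, f):
    # x_i ~ N(0, I), eps_i ~ N(0, 1), y_i = w.x_i + eps_i * (f.x_i)
    X = rng.standard_normal((n, d))
    eps = rng.standard_normal(n)
    y = X @ w + eps * (X @ f)
    return X, y

def estimate(X, y, n_iters=10, floor=1e-3):
    """Toy alternating scheme: WLS for w, spectral step for f (up to sign)."""
    n, d = X.shape
    # Initialise w with ordinary least squares (ignores heteroscedasticity).
    w_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
    f_hat = np.zeros(d)
    for _ in range(n_iters):
        # Estimate f from squared residuals: E[r^2 | x] = (f.x)^2 when
        # w_hat = w, and E[(f.x)^2 x x^T] = ||f||^2 I + 2 f f^T for
        # Gaussian x, so the top eigenvector of M aligns with f.
        r2 = (y - X @ w_hat) ** 2
        M = (X * r2[:, None]).T @ X / n
        vals, vecs = np.linalg.eigh(M)        # eigenvalues ascending
        scale2 = max((vals[-1] - np.median(vals)) / 2.0, floor)
        f_hat = np.sqrt(scale2) * vecs[:, -1]
        # Weighted least squares for w with weights 1/(f.x)^2, floored
        # to avoid blow-up where the estimated noise is tiny.
        wts = 1.0 / np.maximum((X @ f_hat) ** 2, floor)
        Xw = X * wts[:, None]
        w_hat = np.linalg.solve(Xw.T @ X, Xw.T @ y)
    return w_hat, f_hat

d, n = 5, 20000
w_true = rng.standard_normal(d)
f_true = 0.5 * rng.standard_normal(d)
X, y = simulate(n, d, w_true, f_true)
w_hat, f_hat = estimate(X, y)
print("error in w:", np.linalg.norm(w_hat - w_true))
```

Note that `f` is only identifiable up to sign from the residuals (the model depends on `f` through $(f^T x_i)^2$ in distribution), which is why the abstract connects its estimation to phase retrieval.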

Speaker Profile: Dr. Praneeth Netrapalli is a research scientist at Google Research India, Bengaluru. He is also an Adjunct Professor at IIT Bombay and TIFR, Mumbai, and a Faculty Associate of ICTS, Bengaluru. Prior to this, he was a researcher at Microsoft Research. He obtained his MS and PhD in ECE from UT Austin, and his BTech in Electrical Engineering from IIT Bombay. He is a co-recipient of the IEEE Signal Processing Society Best Paper Award 2019 and the Indian National Science Academy (INSA) Medal for Young Scientists 2021, and was an associate of the Indian Academy of Sciences (IASc) from 2019 to 2022. His current research interest is making training and inference of large language models more efficient.

Webpage Link: https://praneethnetrapalli.org/
