In the era of big data, mathematical optimization has undergone a paradigm shift driven by the availability of vast amounts of data in real time. The need to solve linear optimization problems is central to most quantitative areas, such as Scientific Computing, Machine Learning, Signal Processing, Operations Research, and Computer Vision. Over the last decade, practitioners have increasingly demanded computational frameworks that can handle such massive data sets. While most classical optimization methods are deterministic in nature, recent developments suggest that stochastic methods can outperform their deterministic counterparts on optimization problems with high-dimensional data. The broader goal of this thesis is to design scalable stochastic algorithms for solving large-scale convex optimization problems. Our main focus is on the design, analysis, and implementation of such algorithms; in particular, we develop stochastic iterative methods for solving large-scale linear systems, linear feasibility, and linear programming problems.
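To give a concrete flavor of the kind of stochastic iterative method referred to above, the sketch below implements the classical randomized Kaczmarz algorithm for a consistent linear system Ax = b. This is a standard textbook method, used here purely as an illustration; it is not presented as the specific algorithm developed in this thesis. At each step, one row of A is sampled with probability proportional to its squared norm, and the iterate is orthogonally projected onto the hyperplane that row defines.

```python
import numpy as np

def randomized_kaczmarz(A, b, iters=5000, seed=0):
    """Approximately solve a consistent system Ax = b by randomized
    Kaczmarz: repeatedly project the iterate onto the hyperplane
    {x : a_i^T x = b_i} of a randomly sampled row a_i."""
    rng = np.random.default_rng(seed)
    m, n = A.shape
    row_norms = np.einsum("ij,ij->i", A, A)   # squared row norms ||a_i||^2
    probs = row_norms / row_norms.sum()       # sample row i with prob ∝ ||a_i||^2
    x = np.zeros(n)
    for _ in range(iters):
        i = rng.choice(m, p=probs)
        # Orthogonal projection onto the i-th hyperplane
        x += (b[i] - A[i] @ x) / row_norms[i] * A[i]
    return x

# Toy consistent system (dimensions chosen only for illustration)
rng = np.random.default_rng(1)
A = rng.standard_normal((200, 20))
x_true = rng.standard_normal(20)
b = A @ x_true
x_hat = randomized_kaczmarz(A, b)
# For a consistent system, the error ||x_hat - x_true|| shrinks
# geometrically in expectation as the iteration count grows.
print(np.linalg.norm(x_hat - x_true))
```

Each iteration touches only a single row of A, which is what makes methods of this type attractive when A is too large to process in full at every step.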