The Internet of Things (IoT) introduces a novel dimension to the world of information and communication technology, where connectivity is available anytime, anywhere, for anything. To make this a reality, a large number of battery-powered smart things, such as sensors and actuators, need to be connected to the Internet in an energy-efficient manner. The recently released IEEE 802.11ah Wi-Fi standard, marketed as Wi-Fi HaLow, is considered a very promising technology for the IoT. One of its new features, the Restricted Access Window (RAW), aims to increase efficiency in the face of large numbers of densely deployed, energy-constrained stations. It divides stations into groups and limits simultaneous channel access to one group at a time, thereby reducing the collision probability and increasing scalability in IoT networks. The IEEE 802.11ah standard, however, does not specify how to configure the actual RAW grouping parameters. Existing research has shown that the optimal RAW configuration depends on a variety of network-related parameters, such as the number of stations, traffic patterns, and network load, and that an incorrect configuration severely degrades throughput, latency, and energy efficiency. Moreover, network conditions may change over time for a variety of reasons. This thesis therefore aims to dynamically optimize RAW configurations in real time, adapting to the current network conditions.

Accomplishing this goal involves five components: parameter evaluation, optimization, modeling, network condition estimation, and implementation. (1) For evaluation, an in-depth analysis is needed to gain insight into the impact of network-related parameters on the optimal RAW configuration. (2) Optimization takes a RAW performance model and the network conditions as input and generates RAW configurations as output, according to the pursued performance metrics. (3) This requires a model able to predict RAW performance for a given set of parameters under specific network and traffic conditions. (4) Furthermore, it is indispensable to estimate the network conditions, since they are unknown and may change over time. (5) Finally, as IEEE 802.11ah hardware is not yet on the market, a realistic simulator is vital both for the parameter evaluation and for evaluating the effectiveness of the other three components.

As IEEE 802.11ah targets large-scale IoT networks, this thesis focuses on scenarios where each sensor transmits packets with a certain (predictable) frequency and may change that frequency over time. First, the implementation of the IEEE 802.11ah simulator is detailed, along with experimental results to validate it. The simulator consists of physical layer models and an implementation of the MAC features, including not only RAW but also fast association, an energy state model, adaptive modulation and coding schemes (MCSs), and Traffic Indication Map (TIM) segmentation. Subsequently, RAW performance is evaluated using the implemented IEEE 802.11ah simulator. The simulations show that, with appropriate grouping, the RAW mechanism substantially improves throughput, latency, and energy efficiency. Furthermore, the results suggest that the optimal grouping strategy depends on many parameters, and that intelligent RAW group adaptation is necessary to maximize performance under dynamic conditions. The results provide a significant step towards such a strategy.
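To make the grouping mechanism described above more concrete, the sketch below (in Python) shows one way stations could be partitioned into RAW groups and mapped to RAW slots based on their association identifiers (AIDs), using a cyclic mapping in the spirit of the IEEE 802.11ah slot assignment function. The group sizes, offset value, and helper names are illustrative assumptions, not the exact configurations studied in this thesis.

# Illustrative sketch of RAW grouping: stations (identified by their AID) are
# partitioned into groups, and within a group each station is mapped to a RAW
# slot by a cyclic function in the spirit of the IEEE 802.11ah mapping
# (slot = (AID + offset) mod n_slots). Names and values are assumptions.

def assign_raw_slot(aid, n_slots, offset=0):
    """Map a station's AID to a RAW slot index within its group."""
    return (aid + offset) % n_slots

def build_raw_groups(aids, n_groups):
    """Partition AIDs into contiguous RAW groups of (nearly) equal size."""
    aids = sorted(aids)
    groups = [[] for _ in range(n_groups)]
    for i, aid in enumerate(aids):
        groups[i * n_groups // len(aids)].append(aid)
    return groups

if __name__ == "__main__":
    stations = list(range(1, 65))            # 64 associated stations
    groups = build_raw_groups(stations, 4)   # 4 RAW groups per beacon interval
    for g, members in enumerate(groups):
        slots = {aid: assign_raw_slot(aid, n_slots=8) for aid in members}
        print("group %d: %d stations over %d slots"
              % (g, len(members), len(set(slots.values()))))

With this kind of partitioning, contention in any given slot is limited to the few stations mapped to it, which is the effect the RAW evaluation in this thesis quantifies.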
Moreover, the Traffic-Aware RAW Optimization Algorithm (TAROA) is proposed to adapt the RAW parameters in real time based on the current traffic conditions. TAROA introduces a traffic estimation method that predicts the packet transmission interval of each station using only the packet transmission information obtained by the AP during the last beacon interval. Using simulation results under saturated conditions as an alternative to a RAW performance model, TAROA derives the optimal number of stations to assign to each group according to the estimated traffic conditions, in order to maximize throughput. In addition, an improved traffic estimation method is proposed and integrated into an enhanced version of TAROA, referred to as the Enhanced Traffic-Aware RAW Optimization Algorithm (E-TAROA). Both TAROA and E-TAROA support only homogeneous stations, i.e., all stations use the same MCS and packet size.

A further step is made by applying surrogate modeling. A surrogate model is an efficient mathematical representation of a black-box system; it is based on supervised learning (e.g., Kriging or neural networks) and can be accurately trained with only a few labeled sample data points. This research integrates a surrogate modeling toolbox into the implemented IEEE 802.11ah simulator and trains models that estimate performance under a wide range of network and traffic conditions. Based on the trained models, the Model-Based RAW Optimization Algorithm (MoROA) is proposed. MoROA inherits the traffic estimation method of TAROA and uses the trained models to determine the optimal RAW configuration in real time through multi-objective optimization. As a single trained model only supports homogeneous stations, MoROA supports networks that are heterogeneous in terms of MCS and packet size by introducing multiple RAW groups and assigning all homogeneous stations to the same RAW group. Finally, an advanced surrogate model is presented that can predict the performance of heterogeneous stations. As heterogeneous networks have more parameters and hence an enormous design space, the training methodology is carefully designed to speed up the training process while maintaining relatively high model accuracy.

In summary, this research develops solutions for real-time RAW optimization to support large-scale IoT networks. An open-source IEEE 802.11ah simulator, a patent, and multiple journal and conference papers have been produced as a result.
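To illustrate the real-time optimization loop summarized above, the following Python sketch outlines how a TAROA/MoROA-style controller at the AP might operate each beacon interval: it estimates every station's packet transmission interval from the transmissions observed during the last beacon interval, and then selects the number of stations per RAW group that a (pre-trained) performance model predicts will maximize throughput. The exponential-smoothing estimator, the model interface, and the grid search over group sizes are illustrative assumptions, not the exact algorithms developed in this thesis.

# Illustrative per-beacon-interval RAW adaptation loop in the spirit of
# TAROA/MoROA. The estimator, model interface, and search over group sizes
# are assumptions made for this sketch.

from collections import defaultdict

class RawController:
    def __init__(self, performance_model, alpha=0.5):
        self.model = performance_model       # e.g., a trained surrogate model
        self.alpha = alpha                   # smoothing factor for interval estimates
        self.interval_est = defaultdict(lambda: None)  # per-station interval estimate
        self.last_tx_time = {}

    def observe_transmissions(self, tx_events):
        """Update per-station transmission-interval estimates from
        (station_id, timestamp) pairs observed in the last beacon interval."""
        for sta, t in tx_events:
            if sta in self.last_tx_time:
                sample = t - self.last_tx_time[sta]
                prev = self.interval_est[sta]
                self.interval_est[sta] = (sample if prev is None
                                          else self.alpha * sample + (1 - self.alpha) * prev)
            self.last_tx_time[sta] = t

    def choose_group_size(self, n_stations, candidate_sizes):
        """Pick the stations-per-group value the model predicts performs best."""
        load = sum(1.0 / i for i in self.interval_est.values() if i)  # est. aggregate rate
        return max(candidate_sizes,
                   key=lambda size: self.model.predict_throughput(size, n_stations, load))

if __name__ == "__main__":
    class DummyModel:                        # stand-in for a trained surrogate model
        def predict_throughput(self, size, n_stations, load):
            return -abs(size - 8) - 0.1 * load   # toy objective for the example

    ctrl = RawController(DummyModel())
    ctrl.observe_transmissions([(1, 0.0), (1, 2.0), (2, 0.5), (2, 1.5)])
    print(ctrl.choose_group_size(n_stations=2, candidate_sizes=range(1, 17)))

In the thesis, the role of the dummy model above is played by the simulation-derived saturation results (TAROA) or the trained surrogate models (MoROA), and the selection step becomes a multi-objective optimization rather than a single throughput score.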