Techniques currently used by turf managers to schedule irrigations promote overwatering, causing the inefficient use of water resources. The plant canopy temperature-ambient air temperature differential is a good indicator of the water status of a plant. Field experiments were conducted to assess the potential of using plant canopy temperature, measured with an infrared thermometer, to schedule irrigations for Kentucky bluegrass (Poa pratensis L.) turf and to develop preliminary information for using stress degree day (SDD), crop water stress index (CWSI), and critical point model (CPM) indices to schedule irrigations. Data were collected in the summer and fall of 1983 from differentially irrigated plots. Treatments were: (i) well watered—irrigation at soil water potential of −0.04 MPa; (ii) slightly stressed—at soil water potential of −0.07 MPa; and (iii) moderately stressed—at soil water potential of −0.40 MPa. Variables were measured daily and included canopy temperature, ambient air temperature, solar radiation, vapor pressure deficit, open pan evaporation, wind speed, soil water potential, volumetric water content, number of days after irrigation, and number of days after mowing. The data were used to develop the irrigation scheduling indices, which were evaluated in 1984. Each of the indices was compared to tensiometer-based irrigation scheduling at −0.07 MPa soil water potential. During a 25-day period of hot, dry weather in 1984, water use and number of irrigation events (in parentheses) were 98 (7), 112 (8), 140 (10), and 210 (15) mm, respectively, for irrigation scheduling by tensiometer, SDDpos, CWSI, and CPM. Shoot density, verdure, and root weight were not significantly different among the treatments, but visual quality was higher for the CPM and CWSI treatments, reflecting the greater amount of water applied. Further refinement of these indices could allow them to be useful tools for irrigation scheduling.
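The SDDpos and CWSI indices named above have standard forms in the canopy-temperature literature: SDDpos accumulates positive canopy-minus-air temperature differentials since the last irrigation, and the empirical CWSI normalizes the current differential between a non-water-stressed lower baseline (a linear function of vapor pressure deficit) and a non-transpiring upper limit. A minimal sketch follows; the baseline coefficients and upper limit are illustrative assumptions, not the values developed for Kentucky bluegrass from the 1983 data:

```python
def sdd_pos(canopy_temps, air_temps):
    """Positive stress degree day (SDDpos): accumulate only the positive
    canopy-minus-air temperature differentials (degrees C), measured daily
    since the last irrigation. Irrigation is triggered when the running
    total exceeds a chosen threshold."""
    return sum(max(tc - ta, 0.0) for tc, ta in zip(canopy_temps, air_temps))


def cwsi(tc, ta, vpd, a=0.5, b=-1.9, dt_upper=5.0):
    """Empirical crop water stress index:
        CWSI = (dT - dT_lower) / (dT_upper - dT_lower), clipped to [0, 1],
    where dT = Tc - Ta and dT_lower = a + b * VPD is the non-water-stressed
    baseline (slope b is negative: higher VPD cools a well-watered canopy).
    The coefficients a, b and the non-transpiring limit dt_upper here are
    placeholders, not values fitted in the study."""
    dt = tc - ta
    dt_lower = a + b * vpd
    index = (dt - dt_lower) / (dt_upper - dt_lower)
    return min(max(index, 0.0), 1.0)
```

For example, three days with canopy temperatures of 30, 31, and 29 °C against air temperatures of 28, 28, and 30 °C accumulate an SDDpos of 5.0 °C-days; the third day's negative differential is ignored.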