The technical project details the creation of a graphical hair simulation through an exploration of existing fluid and hair simulations. The science, technology, and society (STS) research section seeks to provide insight into the racial biases built into animation and graphics technology, the historical context in which these biases are rooted, and their future implications. Closer inspection of current graphics literature reveals patterns of racial bias embedded throughout the discipline, from the sample images presented in publications to the technical language itself. These biases are especially visible in research on the animation, simulation, and rendering of humans and of human features that can be characteristic of particular races, such as hair, which is the focus of the technical portion of this thesis. As such, it is crucial to approach even seemingly purely technical problems with an awareness of the implicit biases that are endemic to the field.

The difficulty of simulating human hair movement can be attributed to its structural complexity, which involves the interactions between hundreds of thousands of thin strands. One of the many issues that must be resolved in order to simulate hair is how to handle collisions between strands efficiently, as naïve approaches, such as checking every segment of every strand against every other strand at each time step, are highly impractical. A fluid-based method for simulating hair, in which strands are embedded into a particle-based fluid simulation so that internal collisions are resolved implicitly, was explored; a brief illustrative sketch of this approach is given at the end of this section. Several different fluid simulation approaches and collision resolution strategies were also investigated.

Technology carries with it the values and biases of the society in which it was created. In this paper, we discuss this phenomenon in the context of computer graphics technology and racial diversity to answer the following research questions: What racial biases are present in computer graphics and animation technology and in the media created with this technology, and how have these biases been sustained and molded by technology? What are their implications for the stories told through computer-generated media? This paper expands on previous studies addressing the lack of representation of racially diverse features in computer graphics research, and the implications of that lack, by means of documentary research and discourse analysis. It employs the STS framework of co-production to analyze the relationship between graphics technology and historical and social values, with a focus on how this relationship forms and upholds identities, institutions, discourse, and representation. Evidence of the issue is gathered by surveying prominent research papers in the graphics field, existing documentation and software add-ons for popular animation tools, and open-source software packages aimed at hobbyist creators. By tracking the development of broader media technology and analyzing the quantity and quality of diversity in film over time, both in front of and behind the camera, a narrative of the historical context of these biases is built. Finally, we discuss the social implications of our findings in terms of which stories are ultimately told through the animated medium and who is given the ability to tell them.

Working on both of these projects in tandem has revealed how racial biases have been perpetuated in graphics and animation media and technology.
The research performed to complete the technical project provided critical background knowledge for an informed survey of graphics literature in the STS research project. Additionally, the technical challenges unearthed while implementing a hair simulation for a relatively simple hair type (straight strands) highlighted the need for focused research on other hair types, as the long-held assumption that a simulation that works for one hair type will deliver results of similar quality for others can easily be incorrect. Such faulty assumptions lead to the simulation of some elements being given greater priority while others are pushed to the sidelines. When these elements are human features characteristic of different races, the result is research that embeds certain biases, which are then disseminated through technological tools that reach the broader public via popular media. To create inclusive technology and storytelling, it is essential for researchers, developers, and users alike to become better informed about the biases present in their tools and work, and to include members of diverse social groups when creating narratives about those social groups.
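To make the fluid-based idea above concrete, the following is a minimal, illustrative sketch of treating hair strand vertices as particles whose local density drives a repulsion force, so that strand-strand collisions are handled implicitly rather than through exhaustive segment-segment checks. This is not the implementation developed for the technical project; the names, constants, and the specific force model are assumptions chosen for clarity.

```python
# Illustrative sketch (hypothetical, not the thesis implementation): hair strand
# vertices are treated as particles in an SPH-style solve, so strand-strand
# collisions are resolved implicitly by a density-based repulsion force instead
# of explicit per-segment collision checks. Constants are placeholder values.

import numpy as np

H = 0.02            # interaction (smoothing) radius -- assumed value
REST_DENSITY = 4.0  # target neighborhood "crowding" -- assumed value
STIFFNESS = 50.0    # strength of the repulsion response -- assumed value

def build_spatial_hash(positions, cell_size):
    """Bucket particle indices by grid cell to avoid O(n^2) neighbor searches."""
    grid = {}
    for i, p in enumerate(positions):
        cell = tuple(np.floor(p / cell_size).astype(int))
        grid.setdefault(cell, []).append(i)
    return grid

def neighbors(i, positions, grid, cell_size):
    """Indices of particles within the interaction radius of particle i."""
    base = np.floor(positions[i] / cell_size).astype(int)
    found = []
    for dx in (-1, 0, 1):
        for dy in (-1, 0, 1):
            for dz in (-1, 0, 1):
                for j in grid.get((base[0] + dx, base[1] + dy, base[2] + dz), []):
                    if j != i and np.linalg.norm(positions[j] - positions[i]) < H:
                        found.append(j)
    return found

def implicit_collision_forces(positions):
    """Per-particle repulsion force derived from local particle density."""
    grid = build_spatial_hash(positions, H)
    forces = np.zeros_like(positions)
    for i in range(len(positions)):
        nbrs = neighbors(i, positions, grid, H)
        # Simple kernel: each neighbor contributes more the closer it is.
        density = sum(1.0 - np.linalg.norm(positions[j] - positions[i]) / H for j in nbrs)
        pressure = STIFFNESS * max(density - REST_DENSITY, 0.0)
        for j in nbrs:
            d = positions[i] - positions[j]
            dist = np.linalg.norm(d)
            if dist > 1e-8:
                # Push crowded strand vertices apart along their separation direction.
                forces[i] += pressure * (d / dist) * (1.0 - dist / H)
    return forces

# Example usage: positions is an (N, 3) array containing every vertex of every
# strand; the returned forces would be added to the strands' elastic and gravity
# forces before integrating the next time step.
```

The key property of this scheme is that no pair of strand segments is ever tested directly: local crowding among particles stands in for contact, which is the sense in which embedding strands in a particle-based fluid simulation resolves internal collisions implicitly.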