Following the definition of the United Nations Office for Disaster Risk Reduction (UNDRR), resilience is “the ability of a system, community or society exposed to hazards to resist, absorb, accommodate, adapt to, transform and recover from the effects of a hazard in a timely and efficient manner, including through the preservation and restoration of its essential basic structures and functions through risk management”2, whether planned or unplanned. Resilience starts with emergency preparedness, proceeds through risk and vulnerability assessment, produces local solutions and adaptation strategies, and is demonstrated through monitoring and evaluation3. Research on resilience can help communities break down the barriers between disaster risk reduction and climate change adaptation and thereby increase sustainability. In recent years, many efforts have been devoted to building community resilience. Nevertheless, there is still a long path forward because of the complexity of urbanisation, the large scale of infrastructures, the changing environment, the sheer volume of businesses, and the difficulty of involving all citizens effectively and efficiently.

Our purpose in this ongoing work is to leverage big data technologies to analyse existing approaches to resilience in Europe, overcome their limitations, and develop resilience strategies and software instruments. Communities can use our outputs to anticipate threats promptly and to derive dynamic strategies for better decision-making (Kapucu et al. 2013), including, but not limited to, situational awareness strategies, funding assessment for prevention, and local knowledge diffusion, in both normal and challenging times. A crucial aspect of our approach to community resilience is that citizens are placed at the centre of all decision-making processes and that their quality of life is given the highest priority.

To provide comprehensive resilience strategies, we define resilience indicators for different types of hazards, from chronic stresses (e.g., environmental pollution, endemic violence, and high unemployment) to acute shocks (e.g., infectious disease outbreaks, natural disasters, and terrorist attacks) (Blades 2017). Through an in-depth literature review and interviews with specialists, we propose a complete list of resilience indicators at the local level. We calculate and evaluate the identified resilience indicators for each community by collecting data from various sources (Akerkar 2013), including:

• Social media: we collect information from social pages, social groups and dedicated profiles on popular social networking services (e.g., Facebook, Twitter, and Instagram). However, we need to select sources dynamically based on usage patterns (e.g., Italian users prefer Facebook and Instagram to Twitter); a minimal harvesting sketch is given after this list.
• Previous projects: we use open data repositories and other information sources available from relevant projects (e.g., the EmerGent project) to derive insights from related work.
• The Sendai framework4: this framework contains seven targets and thirty-eight indicators to prevent new and substantially reduce existing disaster risks and losses from 2015 to 2030.
• EU initiatives: many existing legacy datasets (e.g., the DesInventar initiative) are available from which relevant information can be ingested and processed.
• Local knowledge: this source is necessary for acquiring valuable input from people with limited access to social media.
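To make the data-collection step above more concrete, the following is a minimal sketch, in Python, of how a harvested batch of public posts could be pushed into a NoSQL inventory. The REST endpoint, database name, collection name, and field names are illustrative assumptions, not the project's actual implementation.

```python
import datetime

import requests                   # plain HTTP client for a public REST API
from pymongo import MongoClient   # document store acting as the inventory backend


def harvest_posts(keyword, api_url):
    """Fetch public posts matching a hazard-related keyword from a hypothetical REST endpoint."""
    response = requests.get(api_url, params={"q": keyword, "limit": 100}, timeout=30)
    response.raise_for_status()
    return response.json().get("posts", [])


def store_in_inventory(posts, source):
    """Insert harvested posts into the (hypothetical) 'resilience_inventory' database."""
    collection = MongoClient("mongodb://localhost:27017")["resilience_inventory"]["social_media"]
    for post in posts:
        post["_source"] = source                            # provenance of the record
        post["_harvested_at"] = datetime.datetime.utcnow()  # time of ingestion
    if posts:
        collection.insert_many(posts)


if __name__ == "__main__":
    # Example run: harvest flood-related posts from a made-up aggregation endpoint.
    posts = harvest_posts("flood", "https://example.org/api/public-posts")
    store_in_inventory(posts, source="social_media_example")
```

In practice each source (social media, open data, legacy datasets) would have its own connector of this shape, all writing into the same inventory.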
Beyond these sources, related datasets that may act as triggers of a hazardous situation can also be collected. We design the data structure and implement an inventory that contains all of the data mentioned above. The inventory is developed as a web-based service that allows end-users and researchers to retrieve our resilience knowledge repository and to customise it flexibly for future evolution; a minimal query sketch is given at the end of this abstract. To develop and extend the inventory tool with little effort, we build on open-source technologies such as general-purpose programming languages, NoSQL databases, and distributed processing engines. To populate the inventory, we use various application programming interfaces (APIs): the social media harvesting, social media interaction, open data connector, inventory query, legacy system ingestion, sensor embedding, and command and control dialogue APIs.

All the data is analysed to extract useful resilience-related information. Because large-scale data is collected from multiple sources, our inventory may contain heterogeneous, irrelevant, and even inconsistent content and structure. We therefore apply state-of-the-art information fusion technologies as a pre-processing step to guarantee data quality. After this refinement, we organise the resilience indicators in a multi-dimensional matrix and apply computational models to analyse and evaluate the effect of the indicators on communities from two perspectives (a minimal evaluation sketch is also given at the end of this abstract):

• Statistical and prevention analysis: providing statistics together with suitable information to prevent critical conditions affecting community resilience under multiple types of hazards.
• Decision support and prescription analysis: providing suitable information to help end-users (e.g., first responders, local communities, citizens) take the right actions and adopt the right behaviour.

In addition, we design the inventory so that it can be deployed on a cloud-based system to enable high accessibility, high performance, sporadic batch processing, and better scalability. We also provide a set of web-based user interfaces (UIs) for end-users. The UIs enhance the ability of end-users to obtain information promptly and to interact seamlessly with the inventory and the platform. We believe that this ongoing work can effectively increase the understanding of resilience in communities and lead to innovative strategies for improving community resilience.
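As a rough illustration of the web-based query service over the inventory, the sketch below exposes a single endpoint using Flask and the same hypothetical MongoDB layout as the harvesting sketch above; the framework, route, and field names are our assumptions, not the project's documented stack.

```python
from flask import Flask, jsonify, request
from pymongo import MongoClient

app = Flask(__name__)
# Reuse the hypothetical inventory database from the harvesting sketch.
inventory = MongoClient("mongodb://localhost:27017")["resilience_inventory"]


@app.route("/indicators/<community>")
def get_indicators(community):
    """Return stored indicator documents for one community, optionally filtered by hazard type."""
    query = {"community": community}
    hazard = request.args.get("hazard")  # e.g. GET /indicators/community_A?hazard=flood
    if hazard:
        query["hazard"] = hazard
    documents = list(inventory["indicators"].find(query, {"_id": 0}))
    return jsonify(documents)


if __name__ == "__main__":
    app.run(port=5000)
```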
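Similarly, the multi-dimensional indicator matrix and its evaluation can be sketched as follows; the indicator names, normalised values, and weights are invented for illustration and do not represent the project's actual computational models.

```python
import numpy as np
import pandas as pd

# Rows: communities; columns: resilience indicators, already normalised to [0, 1]
# so that higher values always mean "more resilient" (e.g., unemployment inverted
# into an employment score).
indicators = pd.DataFrame(
    {
        "air_quality": [0.8, 0.4, 0.6],
        "employment": [0.7, 0.3, 0.5],
        "flood_preparedness": [0.9, 0.2, 0.6],
    },
    index=["community_A", "community_B", "community_C"],
)

# Hypothetical weights expressing the relative importance of each indicator.
weights = np.array([0.3, 0.3, 0.4])

# Weighted aggregation into one composite resilience score per community.
indicators["resilience_score"] = indicators.to_numpy() @ weights

print(indicators.sort_values("resilience_score", ascending=False))
```

A weighted sum is only one possible aggregation rule; the statistical and prescriptive analyses described above would replace it with richer models over the same matrix.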