Our understanding of the novel properties of materials and particles with dimensions in the nanometer range is growing rapidly, and commercial nanomaterials are reaching the marketplace at an increasing rate (>1000 "nano" products are available in the USA alone). However, we still understand very little about how nanomaterials interact with mammalian systems or the environment. The precautionary principle demands care in handling and distributing nanomaterials in the workplace, marketplace, or environment until we better understand their interactions with biological systems [1–3]. For example, titanium dioxide nanoparticles generate free radicals in light, a property exploited in water photolysis reactions and in the development of self-cleaning surfaces. However, titanium dioxide and zinc oxide nanoparticles are also used in sunscreens, as they absorb UV radiation and effectively prevent sunburn [4,5]. The potential for generating free radicals that would adversely affect skin cells is a cause for concern, particularly as sunscreen use is mandated for childcare centers in some countries, such as Australia. A further example is high-aspect-ratio carbon nanotubes, which have been shown to have detrimental health effects similar to those of asbestos [6,7].

There is also increasing general concern about the occupational health and safety and public health impacts of nanoparticles. Their size allows them to be inhaled; to penetrate the skin, mucosa, or lung tissue; to be transported around the body; and to enter cells, where they could interact with biomolecules and interfere with cellular biochemistry. We have grossly inadequate knowledge of how cells, organs, and tissues interact with nanoparticles, particularly under chronic exposure.
There is, therefore, an urgent need to determine whether nanoparticles negatively impact human and environmental health, and to develop robust and reliable methods for predicting nanotoxicity proactively, rather than reactively once a serious public health or environmental problem appears.
As experiments on the biological effects of nanomaterials are absolutely necessary but time-consuming and expensive, there is a strong need to develop cheaper, predictive methods for rapidly assessing the potential toxicities of new and modified nanomaterials. Computational modeling techniques have a relatively long history, and obvious synergy with experimental toxicology, in assessing the risk of industrial chemicals, agrochemicals, and drugs. Chemical regulatory agencies around the world are increasingly using computational tools, particularly the statistical modeling and machine learning methods embodied in QSAR (quantitative structure–activity relationship) modeling [8,9]. Recent reviews and commentaries in high-impact journals [1–3] have called for computational methods to be applied to understanding nanotoxicology, and regulators are calling for these tools to be developed as a matter of urgency.

As with industrial chemicals, high-throughput methods can provide large volumes of data on the biological effects of nanomaterials, although experimental programs are only just starting to embrace these technologies. Increased use of gene arrays and related technologies to assess more rapidly the impact of chemicals and nanomaterials on biological systems should also be of major significance. Computational tools, and the necessary informatics and ontological tools that must accompany them, will be needed to manage nanotoxicological data in the future. Hopefully, large-scale scientific consortia will be formed (similar to ToxCast and the HPV Challenge for industrial chemicals) to provide data for risk assessment and modeling. Other initiatives will also be important for efficient data sharing between research groups and regulatory agencies: data consolidation efforts similar to ChEMBL and ACToR, data fusion, standards such as MIAME [13,14] for small molecules and gene arrays, and the embrace of new information technologies such as Facebook and RSS feeds.
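To make the QSAR idea above concrete, the sketch below fits a simple linear model relating nanoparticle descriptors to a toxicity endpoint by ordinary least squares. Everything here is illustrative: the descriptor names (size, surface area, zeta potential), the numeric values, and the "toxicity score" endpoint are invented for the example; real QSAR studies use curated experimental data, validated descriptors, and external model validation.

```python
import numpy as np

# Hypothetical training set: each row describes one nanoparticle by three
# assumed descriptors (size in nm, specific surface area, zeta potential);
# y is a purely illustrative toxicity score. All values are synthetic.
X = np.array([
    [20.0, 150.0, -30.0],
    [50.0,  90.0, -10.0],
    [80.0,  60.0,   5.0],
    [35.0, 120.0, -25.0],
    [65.0,  75.0,  -5.0],
])
y = np.array([0.9, 0.5, 0.2, 0.8, 0.35])

# Linear QSAR model: toxicity ~ b0 + b1*size + b2*area + b3*zeta,
# fitted by least squares (intercept handled via a column of ones).
A = np.hstack([np.ones((X.shape[0], 1)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

def predict(size, area, zeta):
    """Predict the toxicity score of a new (hypothetical) nanoparticle."""
    return float(coef @ np.array([1.0, size, area, zeta]))

print(f"predicted toxicity: {predict(40.0, 100.0, -20.0):.3f}")
```

In practice a model this small would be badly underdetermined; the point is only the workflow: descriptors in, a fitted statistical model, and cheap predictions for untested materials, which is the proactive screening role the text envisages for computational nanotoxicology.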