Newborn blood-spot screening to detect potentially treatable disorders is widely practiced across the globe. However, practice varies greatly in terms of the disorders covered, screening technologies, disease definitions, information provision, parental informed consent, and the storage and disposal of residual specimens, partly reflecting the degree to which screening is the subject of explicit legislation (and thus public and media pressure) or is embedded in a general health care system and managed at an executive level. It is generally accepted that disorders to be screened for should comply with the ten Wilson and Jungner criteria, but the way compliance is assessed ranges from broadly based opinion surveys to detailed analysis of quantitative data. Consequently, even countries with comparable levels of economic development and health care show large differences in the number of disorders screened for. Several areas lack generally accepted guidelines: How should parents be informed about screening, and to what extent should they be encouraged to regard it as an option they may choose to refuse? Is DNA mutation analysis acceptable as part of a screening protocol? How soon should blood samples be destroyed once screening has been completed? As technology advances and the potential scope of screening expands at both the metabolite and genome levels, challenging policy issues will have to be faced.