Results from international examinations such as TIMSS can lead to misleading policy-level decisions when the test items are not free of item bias, since policy makers use these results to review educational policies, curricula, teaching methods and assessment methods. Such uses and decisions are valid only to the extent that the items are bias free. The purpose of this study was therefore to investigate differential item functioning (DIF) in the 2007 TIMSS test items and its impact on the performance of students in Botswana, Singapore and the United States of America. The study followed a descriptive and exploratory design with a quantitative approach to data collection. The data were downloaded from the TIMSS website after permission was obtained from Boston. Ethical considerations were observed. A comparative analysis of the data was conducted using two statistical methods: the Scheuneman modified chi-square procedure and the Mantel-Haenszel procedure. The findings of this study showed that the TIMSS items functioned significantly differently among students from Botswana, Singapore and the United States of America, thereby showing DIF across the nations. Few items were flagged for DIF across gender.
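For readers unfamiliar with the Mantel-Haenszel procedure referenced above, the sketch below illustrates how a DIF statistic might be computed for a single dichotomous item. It is a minimal illustration only, not the exact analysis used in the study: the group labels (reference vs. focal), the score-based strata, and the function name are assumptions introduced here for demonstration.

```python
import numpy as np
from scipy.stats import chi2

def mantel_haenszel_dif(correct, group, strata):
    """Mantel-Haenszel DIF check for one dichotomous item.

    correct : 0/1 array of item responses (1 = correct)
    group   : 0/1 array, 0 = reference group, 1 = focal group
    strata  : integer array of ability strata (e.g. binned total scores)

    Returns the continuity-corrected MH chi-square, its p-value (1 df),
    and the ETS delta-MH effect size (negative values suggest DIF
    against the focal group).
    """
    correct, group, strata = map(np.asarray, (correct, group, strata))
    sum_a = sum_ea = sum_var = num = den = 0.0
    for k in np.unique(strata):
        m = strata == k
        a = np.sum((group[m] == 0) & (correct[m] == 1))  # reference, correct
        b = np.sum((group[m] == 0) & (correct[m] == 0))  # reference, incorrect
        c = np.sum((group[m] == 1) & (correct[m] == 1))  # focal, correct
        d = np.sum((group[m] == 1) & (correct[m] == 0))  # focal, incorrect
        n = a + b + c + d
        if n < 2:
            continue  # stratum too small to contribute
        sum_a += a
        sum_ea += (a + b) * (a + c) / n                  # expected a under no DIF
        sum_var += (a + b) * (c + d) * (a + c) * (b + d) / (n**2 * (n - 1))
        num += a * d / n                                 # numerator of MH odds ratio
        den += b * c / n                                 # denominator of MH odds ratio
    mh_chi2 = (abs(sum_a - sum_ea) - 0.5) ** 2 / sum_var # continuity-corrected statistic
    p_value = chi2.sf(mh_chi2, df=1)
    delta_mh = -2.35 * np.log(num / den)                 # ETS delta metric
    return mh_chi2, p_value, delta_mh
```

Under the ETS convention, items with |delta-MH| below 1 are treated as showing negligible DIF, while items with |delta-MH| above 1.5 and a significant chi-square are flagged as showing large DIF; the Scheuneman modified chi-square offers an alternative, stratum-based comparison of observed and expected correct responses.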