Video streaming has become increasingly popular, with commercial applications such as YouTube accounting for a large share of Internet traffic. While streaming video is sensitive to bandwidth jitter, a receiver buffer can ameliorate the effects of jitter by absorbing the difference between the transmission rate and the playback rate. Unfortunately, few studies have examined how best to size the receiver buffer for TCP streaming. In this work, we investigate how the buffer size of video streaming applications should change with respect to variation in bandwidth. We model a video streaming system over TCP in simulation to develop our buffering algorithm. We propose dynamically sizing the client buffer based on measured bandwidth variation to reduce interruptions in video playback.
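The abstract does not specify the sizing rule, but as a rough illustration of the idea, the Python sketch below grows the buffer target with the coefficient of variation of recent throughput measurements and with how close the playback rate is to the available bandwidth. The function name, constants, and heuristic are assumptions for illustration, not the paper's algorithm.

```python
import statistics

def buffer_target_seconds(samples_bps, playback_bps,
                          base_buffer_s=2.0, k=4.0, max_buffer_s=30.0):
    """Hypothetical heuristic: pick a client buffer size (in seconds
    of video) from recent throughput samples (bits/s).

    A steady link keeps roughly the base buffer; a jittery link, or one
    whose mean bandwidth barely exceeds the playback rate, gets more.
    """
    mean_bw = statistics.fmean(samples_bps)
    # Coefficient of variation: relative jitter, independent of scale.
    cv = statistics.pstdev(samples_bps) / mean_bw if mean_bw else 1.0
    # Headroom factor: closer playback rate to mean bandwidth -> bigger buffer.
    headroom = max(1.0, playback_bps / mean_bw) if mean_bw else 1.0
    target = base_buffer_s * (1.0 + k * cv) * headroom
    # Clamp between the base buffer and a cap on startup delay/memory.
    return min(max(target, base_buffer_s), max_buffer_s)

# Example: throughput samples around 5 Mb/s with noticeable jitter,
# playing a 4 Mb/s stream.
samples = [5.2e6, 3.1e6, 6.0e6, 2.4e6, 5.5e6, 4.8e6]
print(f"target buffer: {buffer_target_seconds(samples, 4.0e6):.1f} s")
```

The design intuition matches the abstract: when measured bandwidth varies little, a small buffer suffices; when it varies widely relative to the playback rate, a larger buffer absorbs throughput dips and avoids playback stalls.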