Upper limb prosthesis users currently lack haptic feedback from their terminal devices, which significantly limits their ability to interact meaningfully with their environment; they therefore rely heavily on visual feedback when operating terminal devices. Previous work has shown that force-related feedback from an end-effector or virtual environment can help users minimize errors and improve performance. Current myoelectric control systems allow the user to control the velocity of terminal devices. We have developed a novel control method based on ultrasound sensing, called sonomyography, that enables position control derived from the mechanical deformation of muscles. In this paper, we investigated whether proprioceptive feedback from muscle deformation, combined with vibrotactile haptic feedback, can minimize the need for visual feedback. Able-bodied subjects used sonomyography to control a virtual cursor in a target acquisition task, and the effects of visual and haptic feedback on task performance were systematically tested. We found that subjects made large errors when reacquiring a target without visual feedback; with real-time haptic feedback, however, the precision of target reacquisition improved and was similar to that achieved with visual feedback. This result has implications for improving the performance of prosthetic control systems.