We present an approach to motion planning for humanoid robots that aims to ensure reliable execution by augmenting the planning process to reason about the robot's ability to successfully perceive its environment during operation. By efficiently simulating the robot's perception system during search, our planner uses a perceptive capability metric that quantifies the 'sensability' of the environment in each state, given the task to be accomplished. We have applied our method to planning robust autonomous grasping motions and walking sequences for an HRP-2 humanoid. A fast GPU-accelerated 3D tracker provides perception, and both the grasp planner and the footstep planner incorporate reasoning about the robot's perceptive capability. Experimental results show that accounting for the predicted perceptive capability keeps sensing operational throughout the grasping or walking sequence and yields higher task success rates than perception-unaware planning.
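As a rough illustration only (not the paper's actual algorithm, metric, or implementation), the sketch below shows one way a search-based planner can fold a simulated sensability score into its cost and pruning, so that states from which the task-relevant targets would become unobservable are avoided. All names, thresholds, and the simplified field-of-view check are hypothetical; a real system would simulate the full perception pipeline, including occlusions.

```python
import math
import heapq

# Hypothetical "sensability" score: fraction of task-relevant target points
# that fall inside the sensor's field of view and within range from a robot
# state (x, y, heading). A real system would simulate the tracker itself;
# this sketch only checks bearing and distance.
def sensability(state, targets, fov_rad=math.radians(60), max_range=2.0):
    x, y, heading = state
    visible = 0
    for tx, ty in targets:
        dx, dy = tx - x, ty - y
        dist = math.hypot(dx, dy)
        bearing = abs((math.atan2(dy, dx) - heading + math.pi) % (2 * math.pi) - math.pi)
        if dist <= max_range and bearing <= fov_rad / 2:
            visible += 1
    return visible / len(targets)

# Perception-aware A*-style search over a discrete set of motions. States whose
# predicted sensability drops below `min_sense` are pruned, and lower
# sensability otherwise increases cost, so the planner prefers sequences along
# which the targets stay observable.
def plan(start, goal_xy, targets, motions, min_sense=0.5, w_sense=1.0, max_iter=20000):
    def h(s):
        return math.hypot(goal_xy[0] - s[0], goal_xy[1] - s[1])

    open_set = [(h(start), 0.0, start, [start])]
    best_g = {start: 0.0}
    for _ in range(max_iter):
        if not open_set:
            break
        _, g, state, path = heapq.heappop(open_set)
        if h(state) < 0.15:                      # close enough to the goal
            return path
        for dx, dy, dth in motions:
            nxt = (round(state[0] + dx, 2),
                   round(state[1] + dy, 2),
                   round((state[2] + dth) % (2 * math.pi), 2))
            s = sensability(nxt, targets)
            if s < min_sense:                    # targets would be lost: prune
                continue
            ng = g + math.hypot(dx, dy) + w_sense * (1.0 - s)
            if ng < best_g.get(nxt, float("inf")):
                best_g[nxt] = ng
                heapq.heappush(open_set, (ng + h(nxt), ng, nxt, path + [nxt]))
    return None

if __name__ == "__main__":
    motions = [(0.2, 0.0, 0.0), (0.0, 0.2, 0.0), (0.0, -0.2, 0.0),
               (0.0, 0.0, math.radians(20)), (0.0, 0.0, -math.radians(20))]
    path = plan(start=(0.0, 0.0, 0.0), goal_xy=(1.6, 0.4),
                targets=[(1.8, 0.5)], motions=motions)
    print("planned states:", path)
```

In this toy setting the perception-unaware shortest path may pass through states where the target leaves the field of view; the sensability term steers the search toward a slightly longer sequence that keeps the target observable throughout.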