This isn't a programming-specific question, but a math problem.
If the mouse is in the center of the screen, each side expands equally. If the mouse is all the way to one side, the edge under the mouse stays put and all of the expansion goes to the other side. So you need to split the size change between the two sides according to a ratio.
So, we need to set up a few variables here (sketched in code after the list): width_delta, height_delta, mouse_x_relative, mouse_y_relative, rect_width, rect_height.
- Width delta is the new width minus the old width.
- Height delta is the new height minus the old height.
- Mouse x relative is the mouse's x-coordinate relative to the rect's left side.
- Mouse y relative is the mouse's y-coordinate relative to the rect's top side.
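For concreteness, here's a minimal sketch of that setup in Python (the left/top/width/height layout and the actual numbers are just assumptions for illustration, not from your code):

```python
# Hypothetical setup: a 10x10 rect at (0, 0) being resized to 14x14,
# with the mouse at (5, 5) in the same coordinate space as the rect.
rect_left, rect_top, rect_width, rect_height = 0, 0, 10, 10
new_width, new_height = 14, 14
mouse_x, mouse_y = 5, 5

width_delta = new_width - rect_width      # new width minus old width
height_delta = new_height - rect_height   # new height minus old height
mouse_x_relative = mouse_x - rect_left    # mouse x relative to the rect's left side
mouse_y_relative = mouse_y - rect_top     # mouse y relative to the rect's top side
```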
For each delta, when the mouse is perfectly centered, the two sides of the rectangle change by delta / 2 and delta - delta / 2: half of the delta goes to one side and the other half to the other. So instead of always dividing by 2, we need to figure out how that divisor relates to the mouse position and the size of the rect.
Easy enough: rect_width / mouse_x_relative. Say the rect is 10 wide and the mouse is in the center at 5; 10 / 5 is 2, which distributes the delta equally to both sides of the rect. So we need to divide the delta by rect_width / mouse_x_relative.
left_delta = width_delta / (rect_width / mouse_x_relative)
right_delta = width_delta - left_delta
But we can clean that up to be:
left_delta = width_delta * mouse_x_relative / rect_width
right_delta = width_delta - left_delta
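Putting it together, here's a minimal sketch as a Python function (the name zoom_rect, the left/top/width/height parameters, and the choice to move the left/top edge outward by its share of the delta are my assumptions for illustration):

```python
def zoom_rect(left, top, width, height, mouse_x, mouse_y, new_width, new_height):
    """Resize a rect around the mouse so the point under the cursor keeps
    its relative position inside the rect."""
    width_delta = new_width - width
    height_delta = new_height - height

    # Mouse position relative to the rect's top-left corner.
    mouse_x_relative = mouse_x - left
    mouse_y_relative = mouse_y - top

    # Each edge's share of the delta, based on where the mouse sits in the rect.
    left_delta = width_delta * mouse_x_relative / width
    top_delta = height_delta * mouse_y_relative / height

    # Move the left/top edge by its share of the delta; the right/bottom edge
    # gets the remaining share automatically once the new size is applied.
    return left - left_delta, top - top_delta, new_width, new_height
```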
I believe that should work with your expected behavior, unlike my last answer. When you zoom in (shrink), the rect closes in on the mouse (centering); when you zoom out, it moves away from the mouse (un-centering, if you will), the inverse of the way it moved in.
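A couple of hypothetical calls to the sketch above, just to show that behavior:

```python
# Mouse dead center (5, 5): growing 10x10 -> 20x20 splits the delta evenly.
print(zoom_rect(0, 0, 10, 10, 5, 5, 20, 20))  # (-5.0, -5.0, 20, 20)

# Mouse near the left edge (2, 5): shrinking 10x10 -> 8x8 closes in on the mouse,
# which stays at 20% of the width: (2 - 0.4) / 8 == 0.2.
print(zoom_rect(0, 0, 10, 10, 2, 5, 8, 8))    # (0.4, 1.0, 8, 8)
```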