When using an SGD optimizer with momentum, setting the momentum hyperparameter too close to 1 (e.g., 0.99999) makes the algorithm pick up a great deal of speed: the velocity term accumulates past gradients, and with momentum μ the steady-state step is roughly 1/(1 − μ) times larger than a plain gradient step. Because the momentum term keeps the parameters moving in the same direction even after they have passed the minimum, the algorithm overshoots, slows down, swings back, overshoots again, and oscillates around the optimum. It can oscillate this way many times before settling, so overall it will take much longer to converge than with a smaller momentum value, and if the learning rate is also large the updates may grow so big that training diverges entirely.
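
The sketch below illustrates this oscillation on a toy problem, assuming a 1-D quadratic loss f(w) = w² with its minimum at w = 0; the helper `sgd_momentum` and all hyperparameter values are illustrative, not taken from any particular library:

```python
def sgd_momentum(grad_fn, w0, lr=0.01, momentum=0.9, n_steps=200):
    """Standard momentum update: v <- momentum*v - lr*grad(w); w <- w + v."""
    w, v = w0, 0.0
    for _ in range(n_steps):
        v = momentum * v - lr * grad_fn(w)
        w += v
    return w

def grad(w):
    # Gradient of f(w) = w**2; the optimum is w* = 0.
    return 2.0 * w

for m in (0.9, 0.99999):
    w_final = sgd_momentum(grad, w0=1.0, momentum=m)
    print(f"momentum={m}: |w - w*| after 200 steps = {abs(w_final):.2e}")

# With momentum=0.9 the velocity is damped and w settles near 0.
# With momentum=0.99999 the velocity barely decays between steps,
# so after 200 steps w is still swinging back and forth past the optimum.
```

Running this, the moderate-momentum run ends within about 10⁻⁵ of the optimum, while the extreme-momentum run is still roughly a unit away, since each step its velocity is multiplied by 0.99999 and is therefore almost never damped.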