I do know Ohm's Law, and I have known it since the electrical courses I took back in high school, the two years of Electrical Technology college courses, the nine months of the Navy ET "Class A" School at Great Lakes, the "Class C" schools I took after that, and the three years of active service as an ET aboard ship. Then there were the thirty-some years as a computer tester, designer, programmer, and writer at Xerox, where knowing the electrical basics (and the more advanced theory I had picked up) was also useful.
Current is not forced into a device; current is the result of a voltage applied across a resistance. If you have (for example) a 19 V source applied across a 10 ohm load, the result will be a (19/10) 1.9 A drain on the source - AS LONG AS the power source can supply that much current. The fact that the OP's second power supply is rated at just 3.19 A, whereas the original was rated at 3.42 A, does not mean that 3.42 A was ever actually drawn from the original supply - only that it was rated to reliably deliver that much, if needed.
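To put numbers on that, here is a minimal sketch of the arithmetic, using the 19 V / 10 ohm example and the OP's two adapter ratings (the function name and structure are just for illustration):

```python
def current_draw(voltage_v, resistance_ohm):
    """Ohm's Law: current drawn by a resistive load, I = V / R (amps)."""
    return voltage_v / resistance_ohm

# The example load: 19 V across 10 ohms draws 1.9 A.
draw = current_draw(19.0, 10.0)
print(f"Load draws {draw} A")

# Either supply rating covers that draw with headroom to spare.
for rating in (3.42, 3.19):
    print(f"{rating} A supply has headroom: {draw <= rating}")
```

The point the numbers make: the load, not the supply, sets the current, and both ratings sit well above what this example load would ever pull.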
And just because a power supply is rated at 3.19 A does not mean it can't deliver 3.20, 3.21, or even 3.42. As you push a power supply beyond its rated limits, what generally happens is that more ripple (a less steady voltage) rides on the DC output. Few power supplies have a cutoff point so tightly tied to their rated specs.
More than likely, once the computer is up and running and past the initial surges, the actual current drawn from the power supply will be much less than those ratings. So, again I say: if the plug fits, the voltages are that close, and the current capability is about the same or better, just try it. I'll bet my over 40 years of experience that nothing bad happens.