Welcome to Ubuntu Forums, nickeforos!
Looking at the wget man page, there seems to be no built-in option to do what you want, so you will have to loop over the URLs and run wget on each one individually.
For example, assuming your URL list file contains one URL per line, you can run the following loop -

count=1
while read -r URL; do
    wget --recursive --convert-links --page-requisites --html-extension --restrict-file-names=windows -O "file-$count" "$URL"
    count=$((count+1))
done < list.txt

The above loop should merge each URL's downloaded data into "file-<the URL's line number in the list file>" (e.g. file-1, file-2, file-3, etc.), since -O combined with --recursive writes all retrieved content to the single named file. (--no-clobber is omitted here because wget accepts -nc together with -O only if the output file does not already exist, and -O already gives each URL its own file.) If you need a more meaningful name for each file, a small helper that derives a name from the URL can be added to the loop, along with a check to make sure no files are overwritten in case of duplicate names.
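As a starting point for the "more meaningful name" idea, here is a minimal sketch of such a helper. The function name name_from_url and the exact sanitising rules (strip the scheme, replace slashes with underscores, append -2, -3, ... on collision) are my own assumptions, not anything wget provides - adjust them to taste.

```shell
#!/bin/sh
# Hypothetical helper: turn a URL into a filesystem-safe file name,
# appending a numeric suffix if a file with that name already exists.
name_from_url() {
    url=$1
    # Strip the scheme (http://, https://, ...), drop any trailing
    # slashes, and replace remaining slashes with underscores,
    # e.g. http://example.com/a/b -> example.com_a_b
    name=$(printf '%s' "$url" | sed -e 's|^[a-zA-Z]*://||' -e 's|/*$||' -e 's|/|_|g')
    candidate=$name
    n=2
    # Avoid overwriting an existing file by appending -2, -3, ...
    while [ -e "$candidate" ]; do
        candidate="$name-$n"
        n=$((n+1))
    done
    printf '%s\n' "$candidate"
}
```

Inside the loop you would then replace -O "file-$count" with -O "$(name_from_url "$URL")", so each download is named after its URL instead of its line number.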