Page 1 of 5 123 ... LastLast
Results 1 to 10 of 42

Thread: Could something like this be done or did I watch too many movies?

  1. #1
    Join Date
    Nov 2012
    Beans
    265
    Distro
    Ubuntu 13.04 Raring Ringtail

    Could something like this be done or did I watch too many movies?

    Hey

Got an idea yesterday that maybe a programmer could answer best. It needs some thinking out of the box for a second.
Maybe you will laugh, but what the heck: if this is not possible, at least I'll be a little smarter after the discussion, or maybe it will make a good concept for an SF movie

Anyway, my idea is about "downloading" (copying a file) and the possibility of speeding things up using only software, without upgrading hardware or your internet connection.

    So, how this would work...

As I understand it, when you download a file, the download manager reads the file byte by byte and copies it to your HDD. Is that correct? If it is, read on...

What if you didn't need to download the actual file? What if the download manager could make an exact local copy of the file you want to download just by getting some information from the Server about that file?

    How this would work:
The Client sends a download request for a file to the Server. The Server analyses that file and says to the Client: I'm not going to send you such a large file physically, it would take too much time and bandwidth; instead I'm gonna send you signals that perfectly describe what this file looks like, and then you can use the information I give you to make a local copy of the same file that exists here.

So it's like this: don't send me the physical file, send me a description of that file, and I will make it physical here locally.

I have no real idea how that would work; maybe the Server would say "ABC" and the client would know it has to write the information that the code "ABC" represents.

So instead of downloading 1 MB of some data, the server would send a code "ABC" (for example) and the client would know what that means and make an exact copy of that 1 MB of data. So if sending the information "ABC" is quicker than downloading 1 MB of actual data, then you would "download" that 1 MB faster on the same internet connection, thus boosting the speed of copying something over the internet.

Now, this probably is not possible, but I want to know why.

And is there some kind of workaround to achieve something similar? I mean, do we really need to download something? Can't we just clone it locally based on information sent? Or would that hit a point of diminishing returns?

    So fiction or plausible?

  2. #2
    Join Date
    Jun 2011
    Location
    United Kingdom
    Beans
    Hidden!
    Distro
    Lubuntu Development Release

    Re: Could something like this be done or did I watch too many movies?

Perfectly plausible; this is what happens (if I understand you correctly) with compression, i.e. downloading a zip file. The problem is, compression isn't that good, so "ABC" wouldn't work too well.

  3. #3
    Join Date
    May 2006
    Location
    Milwaukee,WI
    Beans
    6,280
    Distro
    Xubuntu 14.04 Trusty Tahr

    Re: Could something like this be done or did I watch too many movies?

I don't think this would work for, say, a movie. If you wanted to download a movie, there's no small amount of data the server could provide for the client to "make" that movie locally.

    cool idea however

  4. #4
    Join Date
    Jan 2007
    Beans
    6,537
    Distro
    Ubuntu 13.04 Raring Ringtail

    Re: Could something like this be done or did I watch too many movies?

    Yep, you've just described compression.

    Say you have a file containing the following data:

AAABB BBCCC CCDEE (15 characters)

A compression algorithm might shorten that down to:

3A 4B 5C D 2E (9 characters)

The compressed file doesn't contain the original data, it contains "signals that perfectly describe what this file looks like", as you put it.

    Compression is used to speed up connections. It's used on the server side somewhat, and the old Opera Mini browser for phones used to compress all traffic to try and make browsing on a nasty GPRS connection tolerable.
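The run-length idea above can be sketched in a few lines of Python. This is a toy illustration only, nothing like a real compressor, and it assumes the input itself contains no digits:

```python
def rle_encode(data: str) -> str:
    """Run-length encode: 'AAABBBBCCCCCDEE' -> '3A4B5CD2E'.
    Assumes the input contains no digit characters."""
    out = []
    i = 0
    while i < len(data):
        j = i
        while j < len(data) and data[j] == data[i]:
            j += 1                         # walk to the end of the run
        run = j - i
        out.append((str(run) if run > 1 else "") + data[i])
        i = j
    return "".join(out)

def rle_decode(encoded: str) -> str:
    """Reverse rle_encode: expand each count+character pair."""
    out = []
    count = ""
    for ch in encoded:
        if ch.isdigit():
            count += ch                    # accumulate multi-digit counts
        else:
            out.append(ch * (int(count) if count else 1))
            count = ""
    return "".join(out)

original = "AAABBBBCCCCCDEE"
packed = rle_encode(original)
print(packed)                              # 3A4B5CD2E
print(rle_decode(packed) == original)      # True
```

The "description" (9 characters) is shorter than the data (15 characters) only because the data is repetitive; on data with no runs, this scheme makes files longer, which is the general catch with all compression.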

  5. #5
    Join Date
    Nov 2012
    Beans
    265
    Distro
    Ubuntu 13.04 Raring Ringtail

    Re: Could something like this be done or did I watch too many movies?

Ok, so I invented ZIP. Great, '89 here I come.

So basically what you guys are saying is: we are already achieving this with compression, and there is no other magical way to do this stuff better than what we are already doing?

  6. #6
    Join Date
    Nov 2006
    Location
    Belgium
    Beans
    3,025
    Distro
    Ubuntu 10.04 Lucid Lynx

    Re: Could something like this be done or did I watch too many movies?

Quote Originally Posted by Thee
So basically what you guys are saying is: we are already achieving this with compression, and there is no other magical way to do this stuff better than what we are already doing?
Well, two things:
1-
"compression" is a broad, generic term. There are many compression algorithms, and it's not impossible to come up with a new one.

2- hash/fingerprint functions, maybe?
In practice, each file produces an effectively unique hash (collisions exist in theory, but are very hard to find).
So if the server sends you the hash of the file you requested, all you have to do is recreate a file that, fed through the same function, produces the same hash. You then know that the file you created is identical to the one on the server.

Unfortunately, creating the correct file may be extremely difficult.
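Here is what the fingerprint idea looks like with Python's standard hashlib (SHA-256; the file contents are made up for the example):

```python
import hashlib

def fingerprint(data: bytes) -> str:
    """Boil arbitrarily large data down to a fixed-size fingerprint."""
    return hashlib.sha256(data).hexdigest()

# The "server" has a large file but sends only its 64-character fingerprint.
server_file = b"some very large file contents..." * 1000
digest = fingerprint(server_file)
print(len(server_file), "bytes summarised by", len(digest), "hex characters")

# The "client" can verify that a candidate file matches the fingerprint...
candidate = b"some very large file contents..." * 1000
print(fingerprint(candidate) == digest)   # True: byte-for-byte identical
# ...but nobody knows how to run the function backwards, from digest to file.
```

So the fingerprint is a perfect *description for verification*, but useless as a *recipe for reconstruction*, which is exactly the difficulty mentioned above.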

  7. #7
    Join Date
    Nov 2012
    Beans
    265
    Distro
    Ubuntu 13.04 Raring Ringtail

    Re: Could something like this be done or did I watch too many movies?

Quote Originally Posted by koenn
Well, two things:
1-
"compression" is a broad, generic term. There are many compression algorithms, and it's not impossible to come up with a new one.
Yeah, I know that, but it's not possible in the way I described, or it is possible but not practical enough.


2- hash/fingerprint functions, maybe?
In practice, each file produces an effectively unique hash (collisions exist in theory, but are very hard to find).
So if the server sends you the hash of the file you requested, all you have to do is recreate a file that, fed through the same function, produces the same hash. You then know that the file you created is identical to the one on the server.

Unfortunately, creating the correct file may be extremely difficult.
Not really sure how that works, but it sounds interesting.
Is there any small-scale "proof of concept" of those functions being used in the way I described?

  8. #8
    Join Date
    Jun 2011
    Location
    United Kingdom
    Beans
    Hidden!
    Distro
    Lubuntu Development Release

    Re: Could something like this be done or did I watch too many movies?

Quote Originally Posted by Thee
Not really sure how that works, but it sounds interesting.
Is there any small-scale "proof of concept" of those functions being used in the way I described?
    "One-way" compression is often a good way to check file integrity, i.e. MD5sums.

    You can generate a unique hash from a file like this:

    Code:
    md5sum <file>
...but try recreating that file from the hash and you'll find it's (hypothetically at least) a one-way process without an enormous amount of time and computing power.

  9. #9
    Join Date
    Jan 2008
    Location
    Lausanne, Switzerland
    Beans
    341
    Distro
    Ubuntu 13.04 Raring Ringtail

    Re: Could something like this be done or did I watch too many movies?

You can have a look at fractal compression. The benefit is very large compression ratios; unfortunately, the time needed to compress a file is extremely long.

  10. #10
    Join Date
    Nov 2006
    Location
    Belgium
    Beans
    3,025
    Distro
    Ubuntu 10.04 Lucid Lynx

    Re: Could something like this be done or did I watch too many movies?

Quote Originally Posted by MG&TL
... without an enormous amount of time and computing power.
Yeah, in the worst case you'd have to brute-force it, i.e. start from an empty file and generate all possible files until one checks out.
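At toy scale that brute force actually works. Here is a hypothetical sketch that recovers a 2-byte "file" from nothing but its MD5 hash; the same numbers show why it's hopeless for real files (256**2 = 65,536 candidates for 2 bytes, but 256**1,000,000 for a 1 MB file):

```python
import hashlib
from itertools import product

def crack(target_digest: str, length: int):
    """Generate every possible byte string of the given length until
    one hashes to the target digest. The search space is 256**length,
    so this is only feasible for very short 'files'."""
    for candidate in product(range(256), repeat=length):
        data = bytes(candidate)
        if hashlib.md5(data).hexdigest() == target_digest:
            return data
    return None

secret = b"ok"                                # the "file" on the server
digest = hashlib.md5(secret).hexdigest()      # all the server transmits
print(crack(digest, len(secret)))             # b'ok'
```

Two bytes take a fraction of a second; every extra byte multiplies the work by 256, which is the trade-off between bandwidth and processing mentioned below.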


otoh, iirc, rsync uses a sort of "reuse parts of existing files on the destination system to reduce data transfer" approach. It's mostly used to sync two copies of supposedly the same file, but I think such an algorithm could also be applied between unrelated files: "rsync" (remote) picture A against (local) picture B, to convert picture B into (local) picture A.
The problem then becomes: how to find a suitable local file.

rsync's --fuzzy option seems to be going in that direction.
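A much-simplified sketch of that block-reuse idea (this is not the real rsync algorithm, which uses rolling checksums and handles shifted data; the block size and byte accounting here are made up for illustration):

```python
import hashlib

BLOCK = 4  # unrealistically small block size, just for the demo

def blocks(data: bytes):
    """Split data into fixed-size blocks."""
    return [data[i:i + BLOCK] for i in range(0, len(data), BLOCK)]

def sync(local: bytes, remote: bytes):
    """Rebuild `remote` on this machine, reusing any block we already
    have locally. Returns (rebuilt data, literal bytes transferred),
    ignoring the small per-block checksum overhead."""
    have = {hashlib.md5(b).hexdigest(): b for b in blocks(local)}
    rebuilt, transferred = [], 0
    for b in blocks(remote):
        h = hashlib.md5(b).hexdigest()
        if h in have:
            rebuilt.append(have[h])   # block already here: nothing sent
        else:
            rebuilt.append(b)         # new block: sent in full
            transferred += len(b)
    return b"".join(rebuilt), transferred

local = b"AAAABBBBCCCCDDDD"           # what we already have
remote = b"AAAABBBBXXXXDDDD"          # what the server has (one block differs)
data, transferred = sync(local, remote)
print(data == remote, transferred, "of", len(remote), "bytes sent")
```

Only the changed block travels over the wire; the rest is "cloned locally based on information sent", which is about as close to the original idea as practice gets.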



And in the end it's always a trade-off between bandwidth and processing, and their relative costs.

