I suggest you ...

to create a major legal application of Freenet!

Because my idea to create a backup system on Freenet was declined, I am re-submitting my original idea in a more general form, to open a discussion about how it could be done. Unfortunately, the statistically most criminal part of the population in my country, the politicians, is trying to get control over the internet.
As a vehicle they use things like gambling (fully prohibited since 2009) or the child-pornography debate. Instead of creating a law that, for example, adds a full-body scan to the DNA scan of sexual offenders, they try to establish a working censorship system inside the internet.
Because "hacking tools" are also still criminalized (merely owning this kind of program, even to check your own installations, is illegal), it is to be expected that tools like Freenet will also fall under prohibition, as in China. A strong, hard-to-beat argument against prohibition must be established, such as the additional power consumption of hundreds of millions of data backup devices that a working Freenet backup system, comparable to Apple's Time Machine, would avoid!
Look at the situation in Myanmar. The government's first move was to cut off internet access across the border, to gain room for repressive action against critical opinions.
Only North Korea is willing to accept the business cost of prohibiting the internet as a whole; nearly all other countries are trying, more or less successfully, to censor the internet in order to gain decision power over computer screens. Growing CPU power helps these countries. This must be made much harder in the future. To do so, one powerful application must be locked to a big business benefit. At the moment Freenet is extremely vulnerable to prohibition if it becomes more popular.

143 votes
    anonymous shared this idea  ·  Admin →

    43 comments

      • enigma commented  · 

        The idea of a distributed proxy from Arne Babenhauserheide is good for creating copyright resistance, but has no benefit for the user. In nearly every case it makes internet access dramatically slower. Freenet needs a lot of peer-to-peer connections, where direct HTTP needs only one peer-to-peer connection with one round trip (two round trips if no connection exists yet). You can't beat this with Freenet; Freenet will always be slower than direct access. The only exception I can see is perhaps in regions with poor backbone access, like Africa. But that is a question of time and may already be solved. With the success of smartphones, the demand for bandwidth is growing in every region and improving the backbone. A single glass-fibre cable can change the conditions and make Freenet the slow option.

      • enigma commented  · 

        I am glad to see that my idea has found so much support.

        Arne Babenhauserheide's idea to use Freenet as a content delivery network is also a good way to create a huge amount of legal traffic. But there are a lot of problems:
        You need one (or more) mirrors that mirror the entire distribution into Freenet.
        You need to configure your installation to work with FProxy.
        This is not so easy; the vast majority of people use the pre-included options of their distribution.
        This means you must include Freenet support in the general network installer of a distribution or, if that is not possible, create your own derivative distribution.
        That entails a lot of maintenance over time.
        If the support is not continued, it wastes the time of a large number of people, who must then reconfigure back to the normal distribution path.

        I also suggested the backup utility because it gives every piece of content a legal presence within Freenet. I don't know of any country where backups are illegal, because hard discs die everywhere. The content mafia is one of the biggest lobbies against the freedom of the internet. They use every chance to create options for legal control of the internet. Making Freenet "copyright-resistant" is most important for supporting it over time.

      • Anonymous commented  · 

        The major benefit of cloud apps is that people do not need additional backup devices. The maintenance of a backup system also consumes a lot of time, which you can avoid by using the cloud. The problem: all the cloud storage providers are "CC: NSA", "CC: GCHQ", ..., which kills human privacy and is a big conspiracy against human freedom. For this reason I urgently ask for the backup-application idea to be thought through in detail again.

      • toad (Admin, Freenet Project Inc.) commented  · 

        Downloading signed OS updates is a possibility, the main advantage is that it would make it harder for an attacker to know exactly what you are running on your system. However on current performance it would be slower than the mirror network...

      • Arne Babenhauserheide commented  · 

        I use Gentoo GNU/Linux, and I regularly download "distfiles", which are the source snapshots of the programs I update.

        Would it be possible to have freenet act as a proxy to these, so the primary servers have far lower load?

        That would also fulfill the requirement that the files have to be downloaded by many users, so the caching can work.

        Does freenet have sufficient speed for that?

      • Arne Babenhauserheide commented  · 

        How about caching requests on the normal web and acting as a distributed and anonymous proxy?

        This depends on peoples accesses being fairly similar, though.

        Maybe it could be restricted to certain file types which can be served well via freenet. Videos are an example I can think of.

        And maybe torrents, though that would likely be a grey area :)

      • toad (Admin, Freenet Project Inc.) commented  · 

        Also you may want to talk to us on IRC on #freenet on irc.freenode.net. I am around most of the time, so are some other folks. Today I won't be there until at least 1700 GMT, but other people will.

      • toad (Admin, Freenet Project Inc.) commented  · 

        We would be willing to provide hosting and code review for such an experiment. Unfortunately I don't think it's going to be possible to spend any official project resources beyond that at the moment, i.e. my (paid) time. It will take some time to make this work well, but if you want some space on our SVN, let me know. Start coding, post to the tech list, get some folk to help you.

      • toad (Admin, Freenet Project Inc.) commented  · 

        I suggest you start a wiki page for this (on wiki.freenetproject.org). It could be written in almost any language, although an official plugin written in java with a web interface would be ideal. If you are able to start coding, I am willing to provide a basic (less reliable) random-routed-timed-request function for prototyping, as it will be a while before 0.9's tunnels are available.

      • toad (Admin, Freenet Project Inc.) commented  · 

        The best way to determine the popularity of data is simply to request it. Right now it would be fulfilled from your datastore and/or your neighbours' datastores, but if we random route and ignore cache for a few hops, we can avoid that. It will be a noisy signal but averaged over lots of blocks, and over lots of tries, and compared to other popular data, it should be usable.
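
        As a rough illustration of why averaging over many blocks makes this noisy signal usable, here is a small simulation sketch. The hit probability standing in for how widely a block is cached is an assumption for the demo, not Freenet's real routing behaviour:

```python
import random

def estimate_popularity(n_blocks, tries_per_block, hit_prob, seed=0):
    """Simulate random-routed, cache-ignoring probes: each probe for a
    block succeeds with probability hit_prob (an assumed stand-in for
    how widely the data is cached).  A single probe is very noisy, but
    the average over many blocks and tries converges on hit_prob."""
    rng = random.Random(seed)
    total = n_blocks * tries_per_block
    hits = sum(rng.random() < hit_prob for _ in range(total))
    return hits / total
```

        Comparing the estimate for a candidate file against the estimates for known-popular data would then give a relative popularity ranking.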

      • toad (Admin, Freenet Project Inc.) commented  · 

        We already use FEC, 128 data blocks plus 128 check blocks for each segment. Obviously larger segments would reduce the likelihood of unrecoverability, but would cost 4X more CPU to encode/decode. However you may reasonably argue backup is a special case and use 16-bit codes and huge segments in that case.
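
        To make the segment-size trade-off concrete, here is a sketch that computes the probability a segment becomes unrecoverable, assuming each block is lost independently with probability p (an idealisation; real block losses are correlated):

```python
from math import comb

def segment_failure(n_data, n_check, p):
    """A segment decodes as long as at least n_data of its
    n_data + n_check blocks survive, so it fails only when more
    than n_check blocks are lost.  Returns that probability under
    an independent per-block loss probability p."""
    n = n_data + n_check
    return sum(comb(n, lost) * p**lost * (1 - p)**(n - lost)
               for lost in range(n_check + 1, n + 1))

# Larger segments at the same 1:1 check ratio fail far less often:
small = segment_failure(128, 128, 0.4)   # current segment size
large = segment_failure(512, 512, 0.4)   # 4x segment, >4x CPU cost
```

        Under these assumptions the 512+512 segment is many orders of magnitude less likely to be unrecoverable than the 128+128 one, which is the argument for huge segments in the backup special case.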

      • enigma commented  · 

        k depends on the failure probability of a single block, but by calculating it from n, l and p, we can guarantee a stable quality l. In case of failure, you can inform the user and suggest using an older version of the file.

      • enigma commented  · 

        I cannot write the mathematical formulas here, but for every limit l > 0 close to 0 and every n > 0 there exists a k > 0 such that the probability q of an incomplete backup restore is less than l. This is fine, because all other classical backup programs also have q > 0, resulting from the probability of block read errors on the backup media.
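
        The existence claim can be checked numerically. This sketch is my own illustration, assuming each stored block is lost independently with probability p and that any n of the n + k blocks suffice to restore:

```python
from math import comb

def restore_failure(n, k, p):
    """q: probability the restore is incomplete, when any n of the
    n + k stored blocks are enough to reconstruct the save set and
    each block is lost independently with probability p."""
    m = n + k
    return sum(comb(m, lost) * p**lost * (1 - p)**(m - lost)
               for lost in range(k + 1, m + 1))

def smallest_k(n, p, l):
    """Smallest k with q < l.  The loop terminates for any l > 0:
    as k grows, losing more than k of n + k blocks becomes a
    vanishing large-deviation event, so q tends to 0."""
    k = 0
    while restore_failure(n, k, p) >= l:
        k += 1
    return k
```

        A backup tool could run this search once per save set to pick the redundancy needed for a target quality l.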

      • enigma commented  · 

        But the n private-data CHKs of one save set (the data of one backup session) can be extended with ECC CHKs. In my opinion, the ECC should be done bitwise. Assume you have an ECC algorithm where you can freely choose m = n + k, which means you have n bits, store m bits, and can lose up to k bits during recovery without any trouble. The Reed-Solomon code family can be used in this way, for example.
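
        A full Reed-Solomon code won't fit in a comment, but the k = 1 case (one XOR parity block, so m = n + 1) already shows the principle: any single lost block can be rebuilt from the survivors. This is a toy illustration, not Freenet's actual FEC:

```python
from functools import reduce

def xor_blocks(blocks):
    """Bytewise XOR of equal-length blocks.  Storing the XOR of all
    n data blocks as one parity block gives m = n + 1, and any one
    lost block equals the XOR of everything that survived."""
    return reduce(lambda a, b: bytes(x ^ y for x, y in zip(a, b)), blocks)

# Store n data blocks plus their parity...
data = [b"backup-1", b"backup-2", b"backup-3"]
parity = xor_blocks(data)

# ...then lose one block and rebuild it from the rest:
recovered = xor_blocks([data[0], data[2], parity])
```

        Reed-Solomon generalises this to arbitrary k at the cost of working over a finite field instead of plain XOR.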

      • enigma commented  · 

        Why is it possible? Some user files always have a modification date much older than the date of the last backup. If the statistic says "0 incarnations", you know something has been lost! From this basis you can calculate the loss probability p < 1 of a single CHK. The user may need n CHKs to store his private data. p**n is, for moderate n, very close to 0; that is the "it doesn't work" you are talking about.

      • enigma commented  · 

        The easiest way to implement special handling for user files is to store them, additionally encrypted, on some FTP space. For smaller amounts of data such storage is freely available on the internet, for example bundled with free web space. But this alone is not secure against sabotage. By evaluating statistical functions over the user's private files, the program can calculate the loss probability.

      • enigma commented  · 

        I see you acknowledge that there is enough space and that popular files are safe in Freenet. The next step is that the backup program must be able to distinguish between popular and unpopular files, so it can handle a user's private files differently. To do so, a backup programmer needs a statistical function to determine the popularity of a file. By the way, this may also interest a freesite master.

      • toad (Admin, Freenet Project Inc.) commented  · 

        I accept there will be a lot of overlap on end-users' files, however the files that overlap are precisely the files that are least valuable. Freenet itself requires a significant amount of redundancy, especially with low-uptime nodes. And it is architecturally impossible to provide any guarantees, especially given that backup is not the sole usage of the network. So what exactly is the use case?

      • enigma commented  · 

        Regular backups, like Apple's Time Machine for example, can reconstruct the state of every hour for one day, every day for one week, every week for one month, and every month until the medium is full. This means the backup software must be able to give a data set an expiry time, so the second hourly backup of a day can be stored with a 24 h lifetime. This provides the data for garbage collection on the nodes.
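
        The tier schedule above could be expressed as a simple lifetime table. The tier names, cutoffs and the classification rule here are my own illustrative assumptions, not part of any existing tool:

```python
from datetime import datetime, timedelta

# Each backup is stored with an expiry ("out-dating") time, so nodes
# can garbage-collect data sets whose lifetime has elapsed.
LIFETIMES = {
    "hourly":  timedelta(hours=24),   # kept for one day
    "daily":   timedelta(weeks=1),    # kept for one week
    "weekly":  timedelta(days=30),    # kept for one month
    "monthly": None,                  # kept until the medium is full
}

def tier_for(taken_at: datetime) -> str:
    """Classify a backup: the first backup of a month is 'monthly',
    the first of a week 'weekly', the first of a day 'daily', and
    every other hourly snapshot just 'hourly'."""
    if taken_at.day == 1 and taken_at.hour == 0:
        return "monthly"
    if taken_at.weekday() == 0 and taken_at.hour == 0:
        return "weekly"
    if taken_at.hour == 0:
        return "daily"
    return "hourly"
```

        A node would then drop any data set whose insertion time plus its tier's lifetime lies in the past.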

