
Thread: Interesting

  1. Default Interesting

You may have already seen this, but I thought it was interesting.

  2. Default

What it means is that we have fucking shot ourselves in the foot and need to prepare to get back on top if we wish to continue to exist. Recently the memristor has been finished and is already being prepped for manufacturing this year. This new type of chip actually increases in strength at smaller sizes, making it perfect for nanotechnology. In addition, this chip can perform calculations and store data at the same time, and can retain the data when power is gone. This makes it the first hardware that could support AI technology in the long run, with the ability to learn. In basic form you will see it as a much faster and more reliable version of SSD tech, and later as an advanced form of RAM. Around that same time you should see phone-size devices using the memristor to store and compute data in crazy small spaces, at capacities exceeding 200GB in the size of a dime.

    It is crap like this that makes me think that perhaps I should start looking into politics because these noobs are never going to actually fix shit and get America back on the pace our founders started us on.

  3. Default

    One of the most depressing things I've ever seen.

  4. Default

    Some of this is a bit silly. The top jobs in 2010 didn't exist in 2004? Apparently one of those jobs is sending information back to 2008 about that fact.

There is a projection problem with a lot of that. As far as jobs using "skills that don't exist" go, funny thing about that: it almost never happens. What happens is that jobs pop up using skill sets that weren't combined before.

    Example: in the early 90's web development started. Web designer was the new job. But what was its skill set? A combination of graphic design and programming, two established skill sets that were now needed together.

    The internet only took 4 years to reach a market of 50 million? What are they smoking? What starting point are they using? Hardware-side, it took a hell of a lot longer than that. (Notice, they even say the total number of Internet devices was 1,000 in 1984, and a million in 1992. Apparently, those million were already out there in 1988, and were each being shared by 50 users. That, or the internet COULD NOT have reached 50 million in 4 years.)
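    A quick back-of-the-envelope sketch of the contradiction above, taking the video's own cited figures at face value (1,000 devices in 1984, a million in 1992, a 50-million market reached in 4 years):

    ```python
    # Sanity check on the video's own numbers, as quoted in the post.
    devices_1984 = 1_000
    devices_1992 = 1_000_000
    claimed_market = 50_000_000  # users the internet supposedly reached in only 4 years

    # If adoption hit 50 million within a 4-year window ending no later than 1992,
    # then with at most ~1 million devices in existence by 1992, every single
    # device would have to be shared by this many users:
    users_per_device = claimed_market / devices_1992
    print(users_per_device)  # 50.0 users crowding around each device
    ```

    Fifty people per device is absurd, which is the post's point: either the device counts or the 4-year adoption claim has to give.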

The iPod and Facebook don't even count, as they were leverages of existing mediums. An iPod is a specific commercial product line; it should only be compared to something like the Sony Discman or Walkman, other products that grew rapidly as major corporate releases of a refined existing technology.

All that unique information generated this year? I'm pretty certain there is a sampling bias, considering that they're counting personal digital photography and text messaging in that (how unique is "OMG, ROFL" anyway?) but they probably are not counting personal photography on film, or the written letters and diaries of past years. It stacks up, especially if you demand the information be stored not just as text but with the nuances of handwriting in imaged formats. Even more, I would bet they are counting digital imaging projects toward current totals. If I am on a project scanning and digitizing manuals for KC-135 refueling planes, is the data generated NOW or in the 50's?

    And these guys really have no idea how college works. So what if half of what you learn in college is outdated when you graduate? The point of being there is learning how to find out what you need to know and apply it. Research. I don't need a worker who knows everything as long as they can find out anything. Secondly, that number presumes that new technical information completely supplants previous technical information. Wrong. There are core basics that never change. So what if I don't know the technical details of a chip used in model XYZ camera? The guys who work on it will learn those things as they need them, and they will be aided by experience in previous projects that used similar principles. The ability to apply research NEVER becomes obsolete. Ever.

By 2013 a supercomputer will be built that matches the computational power of the human brain, blah blah. That was supposed to happen in 2000, after they moved it back from the early 90's, after they moved it back from the early 80's, after they moved it back from sometime in the 70's. See a trend there? Usually we end up finding properties of organic brains we didn't know about before and can't match.

DID YOU KNOW? That predictions are not facts, and the music used for this is an outdated presentation cliché?
    "But it's just a game."
    "So's blackjack. Go cheat in a Moscow casino and when you get caught tell the mobsters it's just a game. They have great sense of humor, you'll have a fun story to tell your future children. Who will have to be adopted, after the little prank the mob does to you in return."
