All The Significant Inventions/Discoveries Were Long Ago

Discussion in 'Science' started by impermanence, Jul 7, 2022.

  1. Fangbeer

    Fangbeer Well-Known Member Past Donor

    Joined:
    Apr 13, 2011
    Messages:
    10,697
    Likes Received:
    3,729
    Trophy Points:
    113
    Right alongside those guys are the guys I'm talking about. Guarantee it. Further, that's not how every shop operates. There's a German medical supply company in my town. All of their equipment is designed, programmed, and tested in Germany. They break it down, ship it here, and reassemble an exact copy of their German factory here in the States, right down to producing their own 50 Hz 3-phase power to run it all. Nobody in the shop here has a say in the design. They do, however, have a small army dedicated to making sure it all remains functional, and they are paid more than I am to teach them how to do it.
     
  2. Fangbeer

    Fangbeer Well-Known Member Past Donor

    Joined:
    Apr 13, 2011
    Messages:
    10,697
    Likes Received:
    3,729
    Trophy Points:
    113
    A PLC is a factory computer. Its job is to collect data from all the input devices (sensors, buttons, control devices, etc.), process that information based on a program, and send command signals to the output devices (motor contactors, linear actuators, valves, etc.). All sorts of fun things can go wrong with that: timing, wiring, component failure, data corruption. A human can quickly diagnose and repair those types of problems in ways that are completely impractical to teach an automated system to address. A lot of the time the failed equipment is replaced with an upgraded or completely different PLC, and just teaching an automated system how to adapt to those differences takes longer than it takes for new differences to appear.

    I can't teach a robot how to rewire every new PLC for every machine I own just in case I might have to replace the PLC in a few hours next week. I can teach a human how a PLC works and they can figure out a PLC they've never seen before in a few hours.
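
    If it helps to see it concretely, here is a rough sketch of that scan cycle in Python (real PLCs run ladder logic or IEC 61131-3 structured text, and every device and signal name below is made up for illustration):

    def read_inputs(rack):
        # Input scan: sample every sensor, button, and control device on the rack.
        return {name: device.sample() for name, device in rack.inputs.items()}

    def solve_logic(inputs, state):
        # Program scan: the user program decides what the outputs should be.
        # Example rung: run the conveyor only while the guard is closed,
        # latch it on with the start button, drop it with the stop button.
        state["conveyor"] = inputs["guard_closed"] and (inputs["start_button"] or state["conveyor"])
        if inputs["stop_button"]:
            state["conveyor"] = False
        return state

    def write_outputs(rack, state):
        # Output scan: energize or de-energize contactors, valves, actuators.
        for name, value in state.items():
            rack.outputs[name].set(value)

    def plc_scan_loop(rack, state):
        # The whole cycle repeats every few milliseconds, forever.
        while True:
            inputs = read_inputs(rack)
            state = solve_logic(inputs, state)
            write_outputs(rack, state)

    Swap the PLC for a different brand and the program logic barely changes, but the wiring, addressing, and I/O mapping all do, and that is exactly the part a human tech sorts out on the spot.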
     
    Last edited: Jul 27, 2022
  3. Fangbeer

    Fangbeer Well-Known Member Past Donor

    Joined:
    Apr 13, 2011
    Messages:
    10,697
    Likes Received:
    3,729
    Trophy Points:
    113
    Right now the engineering standard is design for manufacturability. This standard considers the ability of an automated system to produce and assemble a product. It does not consider an automated system's ability to disassemble and repair a product.

    Machines don't have the ability to abstract that humans have. To recognize a thing, they have to have some record of the thing, and a record of how that thing is used.

    Humans, on the other hand, can free-associate. If I ask you to draw a house, you'll probably draw a rectangle with a triangle on top, two squares for windows, a rectangle for a door, and maybe a chimney. It's not any house that ever existed. It's the concept of a house. That's a super hard thing to teach an AI. It's even harder to teach what you can do with a house: live in it, flip it, rent it, run a business, burn it down for insurance, light it up for Christmas, paint it, invest in it, expand it, remodel it, throw a party, hide from the police, stay dry in the rain, heat it, cool it, shower in it, cook in it, run a vacuum, store food (I could fill this forum with all kinds of weird things you can do with a house).

    I can program a robot to click "I am not a robot." I can't program it to click all the pictures that contain a live animal. That skill is what's necessary to maintain a robot.
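
    To make the contrast concrete, here is a rough Python sketch (every name in it is a placeholder I made up, not a real CAPTCHA or vision API): the scripted click is one deterministic line, while "does this picture contain a live animal" needs a model trained on piles of labeled examples, and even then it only recognizes things that resemble its training data.

    # The easy part: a fixed, scripted action. No understanding required.
    def click_im_not_a_robot(page):
        page.click(x=420, y=615)   # hypothetical automation call, coordinates made up

    # The hard part: open-ended recognition. There is no simple rule to write down;
    # you need a classifier trained on many labeled images.
    def contains_live_animal(image_pixels, trained_model):
        probability = trained_model.predict(image_pixels)   # placeholder for a real trained model
        return probability > 0.5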
     
    Last edited: Jul 27, 2022
    Grey Matter likes this.
  4. drluggit

    drluggit Well-Known Member

    Joined:
    Nov 17, 2016
    Messages:
    31,131
    Likes Received:
    28,599
    Trophy Points:
    113
    Hmm.. so... you entirely missed the point. I hope you feel better after the rant though.
     
  5. Mushroom

    Mushroom Well-Known Member

    Joined:
    Jul 13, 2009
    Messages:
    12,569
    Likes Received:
    2,469
    Trophy Points:
    113
    Gender:
    Male
    Wow, you must really be a kid, to get so much of the timeline of technology so completely wrong.

    You say the "computer revolution" took off in the 1990s? Try stepping back another decade.

    I can only imagine you had no idea about this. In 1983, Time Magazine announced that its "Machine of the Year" for 1982 was... the computer. And computing had been growing exponentially for the four decades prior to that. 1982 was a landmark because that was when the computer finally made the huge leap from the corporate world and academia into the home.

    And the LASER was commercialized way back in the 1960s. It simply took another decade or so for the price to drop so that the items that used them were affordable by the masses.

    The same goes for LED lighting. That is hardly new; the technology has been around for roughly 60 years. But until incandescent bulbs started being phased out there was no real reason to invest in them. They were far more expensive up front, even though an LED lasts longer on average than a filament bulb.

    Once again, I suggest that you actually research things instead of just making them up as you go along, because you are off by multiple decades on most of these things you are trying to place.
     
    Grey Matter likes this.
  6. kazenatsu

    kazenatsu Well-Known Member Past Donor

    Joined:
    May 15, 2017
    Messages:
    34,802
    Likes Received:
    11,298
    Trophy Points:
    113
    No I didn't. I just said that's when it reached its peak.
    The increase in functionality was slower after the 90s than in the few decades before it. Having to use a 90s computer wouldn't be too terrible.
     
    Last edited: Jul 28, 2022
  7. HereWeGoAgain

    HereWeGoAgain Banned

    Joined:
    Nov 11, 2016
    Messages:
    27,942
    Likes Received:
    19,979
    Trophy Points:
    113
    So you just make this crap up, don't you.

    The transistor count is what determines everything else.
     
    Last edited: Jul 28, 2022
    Grey Matter and Mushroom like this.
  8. kazenatsu

    kazenatsu Well-Known Member Past Donor

    Joined:
    May 15, 2017
    Messages:
    34,802
    Likes Received:
    11,298
    Trophy Points:
    113
    What are you talking about? I think you're just imagining things I said that I didn't.

    None of your statistics disprove anything I said.

    You're just being block-headed if you think they do.
     
    Last edited: Jul 28, 2022
  9. Mushroom

    Mushroom Well-Known Member

    Joined:
    Jul 13, 2009
    Messages:
    12,569
    Likes Received:
    2,469
    Trophy Points:
    113
    Gender:
    Male
    What?

    What in the hell are you even trying to say? Because I certainly have absolutely no idea what that nonsensical post was trying to convey.

    The difference between a computer in 1987 and 1997 was minimal. In fact, other than the CD-ROM becoming a standard item, I really can't think of a single thing that would have been "new" in 1997 that was not available in 1987.

    I can only guess that you were not around, or were in a coma during that era, because once again, what you are saying is completely nonsensical. Hell, in 1995 and 1996 I was still pulling XT-class systems from Hughes Aerospace! You apparently have absolutely no idea what computers were being used at that time. Most home users did not even give up their XT systems until those years; there was absolutely no reason to.
     
  10. Mushroom

    Mushroom Well-Known Member

    Joined:
    Jul 13, 2009
    Messages:
    12,569
    Likes Received:
    2,469
    Trophy Points:
    113
    Gender:
    Male
    Actually, it is many things. And in the late 1990s that did start to become important. But prior to that, it was all about the Operating System.

    And in 1995, that is when Windows 95 came out. And suddenly, every computer slower than an 80386SX was obsolete. Back in the era when DOS dominated, having a faster processor really mattered very little. But once everybody moved to Win95, then CPU speeds and processors became a huge selling point. And within a few years we jumped from 33 MHz to the 1 GHz range.

    But the growth of transistors on a chip did, until just a few years ago, follow the Moore's Law prediction made over 50 years ago. It was not until about 15 years after that prediction, though, that the computer finally started to hit the home as an "appliance". That is why the record for units sold of a single computer model is generally held to have been set by one released in 1982, which stayed on sale until 1994 and sold over 17 million units. And that number would go up even more if one counted the recent retroclones made of it.
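
    Just to put rough numbers on that curve, here is a quick back-of-the-envelope in Python, assuming the usual approximation of a doubling every two years starting from the Intel 4004's roughly 2,300 transistors in 1971 (both figures are the standard rough values, not exact):

    # Rough Moore's Law projection: leading-edge transistor count doubles about every 2 years.
    START_YEAR, START_COUNT = 1971, 2_300   # Intel 4004, approximately
    DOUBLING_PERIOD = 2                     # years per doubling (rough)

    def projected_transistors(year):
        doublings = (year - START_YEAR) / DOUBLING_PERIOD
        return START_COUNT * 2 ** doublings

    for year in (1982, 1995, 2003, 2020):
        print(year, f"{projected_transistors(year):,.0f}")
    # 1982 -> ~104,000      (leading edge; the cheap home machines of 1982 were far below this)
    # 1995 -> ~9.4 million  (Pentium Pro was ~5.5 million, the right ballpark)
    # 2003 -> ~151 million  (the Athlon 64's ~106 million is close)
    # 2020 -> ~55 billion   (big GPUs and SoCs are in this range)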

    Transistor counts were important, but most buyers never actually cared about that. The Operating System more than anything else tended to determine when people replaced their computers. Windows 3 had some impact, but really only after 3.1, and even that was slow, as many companies were still using DOS software. Windows only became dominant after 95, and every computer that did not have at least an 80386SX was instantly obsolete. Within a year all DOS programs were pretty much obsolete, and Win 3.1 programs shortly after that.

    Of course, at that same time we had the Internet finally reaching the masses, which made an even larger push for people and businesses to modernize. Not sure how many here ever used a DOS browser; it was a nightmare experience. Or even the Internet before 1993, when the WWW went public. It was absolutely nothing like what we use today. And it gave people a reason to finally upgrade, and it standardized most of the "computer world". But I agree that what you stated is much closer to reality than what kazenatsu was trying to say.

    And you know he has lost when he makes some statement about nobody being able to disprove what he says. There was no real "increase in functionality" in the 1990s (that was done by the late 1970s), simply an awareness that things could actually be done on a computer. And his comment that "The increase in functionality was slower after the 90s than in the few decades before it" is absolute gibberish. So what, the functionality of a 1995 80486SX was increased over a computer from "a few decades before"? What, better than, say, the IMSAI 8080?

    [Image: an IMSAI 8080]

    Well yeah, no duh! Computers then did not even have a keyboard. Or a screen. You literally flipped switches and read the binary output off of LED lights.
     
  11. HereWeGoAgain

    HereWeGoAgain Banned

    Joined:
    Nov 11, 2016
    Messages:
    27,942
    Likes Received:
    19,979
    Trophy Points:
    113
    Well, you talked a lot about marketing and software but that all depends on the transistor count. My original statement still stands.

    The software and functionality and speed all comes back to the transistor count. The consumer just doesn't know it.

    Perhaps the users here just don't require the functionality computers have to offer now. If all you do is screw around on the internet, then you probably don't realize how seamlessly we in the engineering world and other professionals can maneuver through software. I am constantly amazed by how much more my computer can do all the time. And I don't even have to know until it takes care of things for me.
     
    Last edited: Jul 29, 2022
  12. Mushroom

    Mushroom Well-Known Member

    Joined:
    Jul 13, 2009
    Messages:
    12,569
    Likes Received:
    2,469
    Trophy Points:
    113
    Gender:
    Male
    No, not really.

    I already mentioned the top-selling computer of all time, whose CPU had a total transistor count of about 3,500. Yet it had a fully graphical operating system in 1986 (which was the largest-selling GUI for almost a decade).

    Software is what drives the industry, not transistors. In fact, that is why for many years the "killer app" is what drove the industry: VisiCalc, Lotus, Video Toaster, Aldus PageMaker, Ventura Publisher, AutoCAD. That is what sells computers. Nobody but an idiot goes into a store and decides what computer to buy based on transistor count.

    You are looking at something that is just a byproduct of the natural progression of the technology. It never "drove the industry". Hell, I was still making good money up until 1995 selling computers whose CPUs had 29,000 transistors, when the newest on the market had over a million. By the logic that transistor count is the most important thing, I could just as easily point to die shrinks. After all, no jump in transistor count came without first improving manufacturing capabilities and moving to smaller process geometries.

    And if it was "just about the transistors", then why did multi-processor systems never really take off? Oh, I built and used a lot of them, but for very specialized work: mostly CAD and video editing, as well as servers. Throw in two processors and you double the transistor count. So if it was all about transistors, then by 2000 multi-processor systems should have dominated the market. But they did not.

    And then we jump to the 64 bit line of processors. The Athlon 64 came out in 2003 and boasted about 106 million transistors, more than double what other comparable CPUs used. But nobody really used them in 64 bit mode, even after WinXP x64 Pro came out in 2005. That did gain traction among the power users, but nobody really started to care about 64 bit until 2007, when Vista shipped with 64 bit support.

    And just to show how little "transistor count" can mean, I present a recent Samsung CPU, the Exynos 990. Boasting over 8 billion transistors, it is less powerful than my 10-year-old desktop. So if you look, transistors really do not matter on their own. Transistor count just followed its own curve, regardless of what the industry was doing.
     
  13. kazenatsu

    kazenatsu Well-Known Member Past Donor

    Joined:
    May 15, 2017
    Messages:
    34,802
    Likes Received:
    11,298
    Trophy Points:
    113
    From the perspective of the average user, a computer from the mid to late 90s was not that much different than a computer today.

    You can argue the technology has advanced by all sorts of metrics, but I am talking about actual functionality from a human perspective, how it seems to the person using it, and what they can do with it.
     
    Last edited: Jul 29, 2022
  14. Fangbeer

    Fangbeer Well-Known Member Past Donor

    Joined:
    Apr 13, 2011
    Messages:
    10,697
    Likes Received:
    3,729
    Trophy Points:
    113
    Uhh. I think you got the cart in front of the horse.

    You can't sell software for hardware that doesn't exist yet. You don't need hardware for software that doesn't exist yet. We switched from 32 bit to 64 bit because we needed a larger register to address larger volatile memory, not because Microsoft just decided to write a 64 bit OS. The early 64 bit hardware was created knowing that cheaper larger RAM would eventually need it.
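
    The register-width point is just arithmetic; here is a quick Python illustration of the addressable-memory ceiling (idealized flat byte addressing, ignoring PAE, segmentation, and OS reservations):

    # Maximum directly addressable memory for a given pointer/register width.
    def max_addressable_bytes(bits):
        return 2 ** bits

    GiB = 2 ** 30
    print(f"32-bit: {max_addressable_bytes(32) / GiB:,.0f} GiB")   # 4 GiB
    print(f"64-bit: {max_addressable_bytes(64) / GiB:,.0f} GiB")   # about 17 billion GiB (16 EiB)

    Once affordable machines started shipping with more RAM than a 32 bit pointer can reach, the wider registers stopped being a luxury.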

    We got larger RAM because the flip-flops got redesigned. The flip-flops got redesigned because the transistors got smaller. They also got faster, allowing for higher clock frequencies. Software companies then take advantage of this to make better software.

    Hardware has to come first.
     
    Last edited: Jul 29, 2022
  15. Fangbeer

    Fangbeer Well-Known Member Past Donor

    Joined:
    Apr 13, 2011
    Messages:
    10,697
    Likes Received:
    3,729
    Trophy Points:
    113
    By that logic there's not much difference between a Model T Ford and the innovations made by the many companies that entered the market shortly after. What do you need hydraulic brakes for if a metal band around the transmission will stop a 20 hp car?
     
  16. Fangbeer

    Fangbeer Well-Known Member Past Donor

    Joined:
    Apr 13, 2011
    Messages:
    10,697
    Likes Received:
    3,729
    Trophy Points:
    113
    If I put an 80386 in front of a millennial they wouldn't get past POST before wondering why it was broken, getting bored, and throwing it in the trash.
     
  17. WillReadmore

    WillReadmore Well-Known Member

    Joined:
    Nov 21, 2013
    Messages:
    60,159
    Likes Received:
    16,507
    Trophy Points:
    113
    That could only be true if your uses haven't changed.

    But, for most people, their uses have changed in HUGE ways that they might not even recognize, as new capability can sneak up on you.

    Try watching a Netflix vid on a 1990s computer. Try getting GPS directions to your destination, or searching for the nearest gas station. GPS existed before 1990, but the military intentionally degraded the civilian signal for security reasons, a sign of how important the DoD considered GPS. Full-accuracy GPS wasn't available to civilians until Clinton ordered an end to that degradation in 2000. Try calling someone from your car with only 1990s tech. The Internet didn't become common in homes until the 1990s, and obviously the access to information worldwide was a gigantic advance. Try getting camera film processed today, compared to the immediate availability on your phone.

    The list could go on and on just concerning what is on your smartphone.
     
  18. Mushroom

    Mushroom Well-Known Member

    Joined:
    Jul 13, 2009
    Messages:
    12,569
    Likes Received:
    2,469
    Trophy Points:
    113
    Gender:
    Male
    Which has not a thing to do with anything.

    Here, you are talking about the preconceptions of individuals who, in this case, were generally born long after that time.

    Hell, even in the 1990s I remember having to explain that an XT does not even have a POST that they would recognize. Nor a BIOS.
     
  19. Mushroom

    Mushroom Well-Known Member

    Joined:
    Jul 13, 2009
    Messages:
    12,569
    Likes Received:
    2,469
    Trophy Points:
    113
    Gender:
    Male
    Oh holy hell, do you really think that?

    Well here, let me give you a little instructional time then, my sweet summer child.

    Do you really think that Microsoft created 64 bit computing? Oh, nonsense; it had already been around for decades before either Intel or AMD decided that the time was right to migrate it to the PC desktop platform. Hell, server vendors had been using 64 bit processors in their server lines for almost a decade before we saw them in desktops.

    And by 2005 some of those were starting to migrate down to the desktop level. But the issue was that there was no software or OS to take advantage of it yet. You had to use some variant of UNIX in order to use 64 bit processing. And yes, there were already multiple variants of UNIX that had been 64 bit capable for a decade or more. Either that, or Windows Server 2003.

    Hell, even AMD had their Opteron series running 64 bit processors by 2003. But once again, unless you were using some variant of Windows Server or Unix, you could not use that. Early Athlon 64 had the exact same problem.

    Both Intel and AMD could have released "user level" 64 bit processors back in the 1990s, but why? There was no user base that wanted it, no code base to use it, and no operating system other than server solutions and variants of Unix that could use it. And none on the horizon.

    I still remember the backchannel competition between Intel and AMD to set the "64 bit standard", and that a lot of companies were surprised that MS accepted the AMD one instead of the Intel one. And in early 2007, when Vista finally shipped with built in 64 bit support, the software finally started to arrive. I myself ran my A64 system for about 3 months on XP in 32 bit mode; there was simply no other option, really. Heck, it took another year for the first game to use the 64 bit architecture, "Shadow Ops: Red Mercury".

    I can only imagine you were not actually around back then.

    https://www.computerworld.com/article/2555303/the-64-bit-evolution.html

    Quite an interesting look back: an article actually written in 2006, at the time 64 bit was first becoming available to the masses. And the major thing it talks about is the lack of an OS to run on such systems, and almost no software that could take advantage of 64 bit hardware. However, at the time of that article Microsoft had already stated that their next OS would not only be 64 bit compatible but would also support multiple cores on the same die. That gave the hardware companies the incentive to finally start pushing the hardware out to the consumer level, because they knew that within a year and a half the OS would be there, and the software would follow shortly.

    And if Intel had not decided to put multiple CPUs on one die around the time of Vista, and it had instead been treated like the multi-CPU systems of the past, then that too would have largely stagnated for years to come.

    Yes, hardware is always first. But without the "killer app", the hardware essentially dies. This has been known for decades; you do not know this? Wow, you really do not know a lot about this, do you?
     
    Last edited: Jul 29, 2022
  20. Grey Matter

    Grey Matter Well-Known Member Donor

    Joined:
    Feb 15, 2020
    Messages:
    4,432
    Likes Received:
    2,593
    Trophy Points:
    113
    Gender:
    Male
    I would say that the number one most significant discovery was sex.
    Everything else is secondary, tertiary, quaternary, etc....
     
  21. Fangbeer

    Fangbeer Well-Known Member Past Donor

    Joined:
    Apr 13, 2011
    Messages:
    10,697
    Likes Received:
    3,729
    Trophy Points:
    113
    Oh boy. Here we go.

    No, I don't think that. You think that Vista is what made it consumer accessible. That's false. It was consumer-accessible hardware that made Vista useful (if it ever was). It wasn't. No self-respecting business ran Vista. Not many even wanted to switch to 7.

    Prior to advances in volatile memory, the 4 gigs of RAM that 32 bits can address was already more than what the consumer could afford. Hell, in the '90s you could get a Silicon Graphics rig with 64 bit registers pushing data at 600 MHz, but no self-employed architect would ever buy that $100k boat anchor to run AutoCAD 14, which ran just fine on a 32 bit Pentium.

    You argued that software drives the industry, and that's just completely wrong. It's the fact that it didn't cost a grand to put in more than 4 gigs of memory that made Vista popular with home users, and that's only because it was the OS that came preinstalled on the hardware they wanted. It's the fact that you didn't need to drop 10 grand on a SCSI array to store an edited copy of your wedding that allowed Adobe to sell Premiere to the general population, not the fact that Premiere is so nifty.

    Who would write software for hardware that no one can afford? How does it get more affordable? Advances in technology. The "more transistors" argument is just a stand-in for getting more value for less cost.

    Yeah you can run a 128 bit mainframe in 1975. Maybe if you're an insurance company. Not if you're a plumber.
     
  22. Fangbeer

    Fangbeer Well-Known Member Past Donor

    Joined:
    Apr 13, 2011
    Messages:
    10,697
    Likes Received:
    3,729
    Trophy Points:
    113
    My first computer was a TRS-80. I had an 8088. I had a PS/2. A 386. A Pentium II. All before my wife got that crappy eMachine that I promptly wiped and installed XP on.

    I wouldn't trade my i7 with an SSD for any of them.
     
    Last edited: Jul 30, 2022
    Grey Matter likes this.
  23. WillReadmore

    WillReadmore Well-Known Member

    Joined:
    Nov 21, 2013
    Messages:
    60,159
    Likes Received:
    16,507
    Trophy Points:
    113
    All along, Microsoft had a close relationship with Intel concerning their chips. As you point out, their chips needed both an operating system and applications, both of which Microsoft supplied: the apps in the form of tools and migration help for the software industry, plus compatibility strategies so 32 bit apps were supported as is. This took whole groups at MS working with software vendors. After all, the majority of end users buy computers based on applications, not operating systems.

    Plus, Microsoft knew of other requirements that customers find important and that need chip level support: for example, memory management and boot time, which require coordination between OS writers, application writers, and chip designers. Video editing and the like has huge memory requirements, and that is NOT solved simply by putting a bunch of memory into the machine.

    At that time, Apple was still on its own separate (PowerPC) architecture, and MS owned pretty much all of the rest of the OS market on Intel. Nothing was done to make Intel chips MS specific, as shown by Apple later adopting those chips.
     
  24. WillReadmore

    WillReadmore Well-Known Member

    Joined:
    Nov 21, 2013
    Messages:
    60,159
    Likes Received:
    16,507
    Trophy Points:
    113
    Right. The reason they bought that computer was to run AutoCAD, or Adobe, or...?
     
  25. Fangbeer

    Fangbeer Well-Known Member Past Donor

    Joined:
    Apr 13, 2011
    Messages:
    10,697
    Likes Received:
    3,729
    Trophy Points:
    113
    A home consumer? Neither. AOL, MySpace, gaming, and porn.

    A business owner, on the other hand, doesn't buy software that won't run on existing hardware.

    Hardware comes first...
     
    Last edited: Jul 30, 2022
    Grey Matter likes this.
