Communication protocols


Dave Ferguson

I love this discussion........

As I have said and has been said by others on the list........the reason everyone loves UNIX control systems IMHO is that they have spent the time to know how to "tweak" it to be totally (?) stable but they will not devote the same time and attention to lowly NT.

I have systems running 24/7 that have never been rebooted. I also spent a large amount of time setting things up front. The bad thing is I am basically a full time IT person now because I have like 25 systems and 150 PC's out there and the IT people are not used to 24/7 response times. By this I mean, we still have hardware crashes because of a management decision (again IMHO) to not buy "hardened" hardware, but those are rare.

I was totally against control via NT but after learning it as well as I knew UNIX, I now have little to no problems.

I control the machines via user profiles so that someone cannot play solitaire etc., and the interface is locked down to just running my control software. We actually "tested" things and tweaked them just like I had to do the first time with UNIX, and we devoted the time to learn the stuff. I am now an MCSE only because I like that kind of cheap personal gratification to see how I am doing (only for myself). We also ghosted the machines to network drives so that if there are any hardware issues we can restore the machines in like 6 minutes.

It made me realize that different does not mean "bad", it only means different. In today's world market you had better be able to accept change and other opinions or you are doomed to fail............

Adapt or dye..........and oh by the way, the Internet is not going away and why if I had to sum it up isn't it going away?

Because I do not care that I have connected to a Unix, NT, Linux, Mainframe, etc. system to get my information, and that it took 25 hops through routers all over the world..........I only care that I get my INFORMATION..........that is the bottom line. Before it is done, like it or not........the internet browser will be the control system of use.

I already have systems running that just gather info and spit it out to web pages for control and diagnostic information........get used to it. I spoke at the ISA show numerous times, five and more years ago, about this "revolution" and people laughed at the time, but look around.............

Computers were designed to make things easier; they are just a tool, like a hammer. If you give a hammer to me and you give one to my friend who is a Master Carpenter, I will build a stick house and he will build a work of art. You must know how to use your tools.......

I better get off my soap box.........

Dave Ferguson
Blandin Paper Company
UPM-Kymmene
DAVCO Automation
 
Randy DeMars wrote:

> I haven't been following this discussion too closely, but this
> section brings up a concern that I have about PC-based control vs.
> traditional PLCs. I would like to hear some opinions and
> explanations of how any of you may be handling this.
>
> Suppose we manufacture a machine that uses PC-based control instead
> of a PLC, and send this to one of our customers. After several
> years they have a problem with part of the computer system and need
> a replacement. The original hardware is no longer available and
> current hardware does not support the software used. (We have seen
> this situation before - system ships in '92 with Win 3.0--> touch
> screen fries in '98 --> new version of touch screen does not support
> Win 3.0 --> search for compatible replacement --> engineering
> required to reconfigure system with new drivers, etc).

The problem here is the forced upgrade/termination of support from the well known vendor. Guaranteed investment destruction. This can bite you many more ways; it's just that this one is the most frustrating. With an Open Source OS this goes away. There is less incentive to drop earlier versions, and since the source is available, the new driver can be backported, or the application can be compiled on the new version. In your example, the touch screen on a Linux box would most likely be serial, and probably would just work, because a Linux application would very likely use the same interface to the serial character driver regardless of kernel or serial driver version. This is not an accident; these APIs are carefully maintained for just this reason.
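To make that concrete, the application side can be as simple as the following sketch (Python for brevity; the device path and line settings are just examples, not from any particular product). The point is that the termios calls look the same no matter which kernel or driver version is underneath:

    import os, termios

    # The touch screen is just a character device; this POSIX/termios
    # interface is the stable API referred to above.
    fd = os.open("/dev/ttyS0", os.O_RDWR | os.O_NOCTTY)   # example port
    attrs = termios.tcgetattr(fd)
    attrs[4] = attrs[5] = termios.B9600                   # in/out speed (example)
    attrs[3] &= ~(termios.ICANON | termios.ECHO)          # raw input, no echo
    termios.tcsetattr(fd, termios.TCSANOW, attrs)
    report = os.read(fd, 16)                              # one touch report
    os.close(fd)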


> I realize that the PC-based solution can offer tremendous
> advantages, but I have a hard time getting past this problem. We
> manufacture capital equipment, and our customers typically perform
> their own maintenance. Trying to get our application running on new
> hardware may be beyond their level of expertise, whereas installing
> a new PLC and downloading the program is not a problem.

This can be mitigated and avoided by burning a CD that includes the OS, the application, and the configuration. Of course, to be legal, this too implies an Open Source OS. In use it's just like the CDs that ship with a lot of Windows systems. When it gets too hosed up, insert CD and boot. If the only thing done with the machine is control (strongly recommended), they are right back in business fast. Or just load a minimal system so you can dial in and do the rest. It is also a good idea to not ship anything that your application doesn't require. If all the system does is what it's supposed to, people don't play around. I've found that Linux helps a lot in this regard; you don't have people loading junk on the machine. In the bad old days, it was quite common for someone to load an application on the PC and mess things up. As more people use Linux I suppose it will lose this advantage, but for now, it really cuts down on user problems. There are lots of ways to make it easy; too bad they all require some forethought:^) We sell PC based test equipment that runs on Linux and support costs have been minimal. Windows was unsupportable for a small shop like ours; we never let it out the door, our own use kept us too busy. I think this is a big part of the bad impression many have about using PC's.
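The restore step itself, by the way, can be trivial. A rough sketch in Python; the image and disk names are invented, and a real script would sanity-check the target before writing:

    import shutil

    IMAGE = "/mnt/cdrom/control-node.img"  # known-good image on the CD (example)
    DISK = "/dev/hda"                      # target disk (example device)

    # Stream the image straight over the disk, dd-style, in 1 MB chunks.
    with open(IMAGE, "rb") as src, open(DISK, "wb") as dst:
        shutil.copyfileobj(src, dst, length=1024 * 1024)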

> We have been shipping PLCs on our equipment since the early '80s and
> except for a few of the very old systems, spare parts are still
> readily available. I realize that some of the industrial computer
> vendors will support their hardware for a certain length of time.
> What kinds of timeframes are you seeing?

It's far better to not depend on specific hardware. If all you need is standard PC facilities and Ethernet, for example, it should work on a new box. There are many apps written years ago that still load and work on new hardware. This is probably by accident, but with a little thought and anticipation it can be achieved intentionally. Standardization of the environment is a PC strong point; we might as well use it.

Regards,

cww
 
Why would you want to use DOS? I assume that your need for a vendor to provide you a TCP stack is related to the fact that you are using DOS, given the reference to DOS in the last paragraph.

Why don't you use an operating system that comes with TCP, like Linux or NT?
 
I tried to resist. I really did. But this just screams at me!!!

-> I love this discussion........
->
-> As I have said and has been said by others on the list........the
-> reason everyone loves UNIX control systems IMHO is that they have
-> spent the time to know how to "tweak" it to be totally (?) stable
-> but they will not devote the same time and attention to lowly NT.

Because everything is either hidden in the cryptic registry, or simply not available to change. For example: I want to free more resources to do my database processing on my server. A new production line is going in, and I don't want to upgrade my server hardware, but I would really like just a little better performance on my database. I decide that the best way to free resources is to shut down all unnecessary processes. Let's start with the biggest one: the GUI.

Linux: No sweat. Change my runlevel.

NT: Errr.....

OK, so we won't shut down the GUI. Not sure why I am forced to have it running, though.

etc. etc. etc.
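For the record, on a SysV-style Linux box the whole operation is something like this (a sketch; the exact paths vary by distribution):

    import subprocess

    # Runlevel 3 is multi-user without X; runlevel 5 boots into the GUI.
    # The permanent setting is one line in /etc/inittab ("id:3:initdefault:").
    print(subprocess.check_output(["/sbin/runlevel"], text=True))  # e.g. "N 5"
    subprocess.run(["/sbin/telinit", "3"], check=True)             # drop the GUI now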

->
-> I have systems running 24/7 that have never been rebooted. I
-> also spent a large amount of time setting things up front. The
-> bad thing is I am basically a full time IT person now
-> because I have like 25 systems and 150 PC's out there
-> and the IT people are not used to 24/7 response times. By this
-> I mean, we still have hardware crashes because of a
-> management decision (again IMHO) to not buy "hardened"
-> hardware, but those are rare.

So if problems are "rare", why are you basically a full-time IT support person? This seems like a contradiction to me. How long have they been running 24/7 without reboot? What service pack does that have you on? Do you mean never reboot, or no unscheduled reboot? If they run 24/7 with nary a problem, what are you doing as a full-time IT person? I guess I don't understand your statement here....

-> I was totally against control via NT but after learning it as well
-> as I knew UNIX, I now have little to no problems.
->
-> I control the machines via user profiles so that someone cannot
-> play solitaire etc. and the interface is locked down to just
-> running my control software, and we actually "tested" things
-> and tweaked them just like I had to do the first time with UNIX
-> and we devoted the time to learn the stuff.

So do you have to start over with Win2K? I am asking because I don't know. I have some experience working with NT Server, but not 2K of any sort.

-> I am now an MCSE only because I like that kind
-> of cheap personal gratification to see how I am doing (only for
-> myself).

Whatever turns you on. I could get one too, I am sure. I have met MCSEs who couldn't do command line redirection.

-> We also ghosted the machines to network drives so that
-> any hardware issues and we can restore the machines in like 6
-> minutes.

A good method of backup. Everyone should have one.


-> It made me realize that different does not mean "bad" it only
-> means different. In today's world market you better be able to
-> accept change and other opinions or you are doomed to
-> fail............

I COULDN'T AGREE MORE! THAT IS THE HEART OF MY
ARGUMENT! nt IS NOT THE ONLY WAY!

-> Adapt or dye

(What color?)
or is that 'die'?

-> ..........and oh by the way, the Internet is not going
-> away and why if I had to sum it up isn't it going away?

WHAT?!?!?!?! Where did that come from?

-> Because I do not care that I have connected to a Unix, NT,
-> Linux, Mainframe etc system to get my information and that
-> it took 25 hops through routers all over the world..........I only
-> care that I get my INFORMATION..........that is the bottom line.
-> Before it is done, like it or not........the internet browser will be
-> the control system of use.

Not true. The browser is good for replication of data reporting throughout an enterprise. The thing to realize, though, is that the WWW is NOT the internet. It is a subset. It is not the most-used service (e-mail is), and it is FAR from being the most efficient way of transporting data. I would much rather have my app talk to your app using TCP/IP and not have to deal with all the overhead of HTML document formatting. Just give me the data, and leave out the <FONT> and <TABLE> crap.
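When both ends agree on a format, it takes almost nothing. A sketch in Python (the host, port, and record format are invented for the example):

    import socket

    # One reading, one line of plain text, no markup wrapped around it.
    with socket.create_connection(("scada.example.com", 9100)) as s:
        s.sendall(b"tank1.level,73.2,GOOD\n")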

-> I already have systems running that just gather info and spit it
-> out to web pages for control and diagnostic information........
-> get used to it.

As I said. It is good for that. Don't tell me that the web browser is the end-all be-all of control platforms though.

Although, I guess one of the good points of putting it all into a web browser is that I could run your browser apps in Netscape on Linux..... :^}

-> I had spoken at the ISA show numerous times 5 years and more
-> ago about this "revolution" and people laughed at the time but
-> look around.............
->
-> Computers were designed to make things easier, they are just a
-> tool, like a hammer. If you give a hammer to me and you give
-> one to my friend who is a Master Carpenter, I will build a stick
-> house and he will build a work of art. You must know how to use
-> your tools.......

The point, though, is that the master carpenter probably owns more than one hammer. And I would even venture to guess that his tools are from more than one manufacturer. That is the point. Windows is not the best answer for everything.

-> I better get off my soap box.........

As will I.

--Joe Jansen
 

Michael Griffin

At 16:37 14/08/00 -0400, Dave Ferguson wrote:
<clip>
>As I have said and has been said by others on the list........the
>reason everyone loves UNIX control systems IMHO is that they have
>spent the time to know how to "tweak" it to be totally (?) stable but
>they will not devote the same time and attention to lowly NT.
<clip>
>I was totally against control via NT but after learning it as well as
>I knew UNIX, I now have little to no problems.
<clip>
My own impression is that anyone who is a genuine Windows NT expert usually also has Linux (or some other Unix) experience. They seem to make a living with Windows, but often their first love is Linux. I consider a "Windows expert" to be someone who can get to the bottom of a problem by means other than changing things at random and hoping for the best. Someone who is genuinely interested in operating systems often knows several fairly well.

I'm glad to see that you are able to "tweak" Windows to get it to do what you want. There seem to be a lot of people who have some sort of Windows certificate or "ticket", but very few who really know what they are doing.
For office systems this situation seems to be acceptable (or at least, tolerated). Given the number of computers with Windows NT operating systems showing up in industry, however, it's a shame that for most of us there seems to be so very little genuine Windows-related expertise available that we can draw on for difficult problems.


>I have systems running 24/7 that have never been rebooted. I also
>spent a large amount of time setting things up front. The bad thing
>is I am basically a full time IT person now because I have like 25
>systems and 150 PC's out there and the IT people are not used to
>24/7 response times.
<clip>
I'm not sure what your figures of "25 systems" versus "150 PCs" mean, but it sounds like you have become indispensable to the operation of your plant. What is it you have to do with these PCs that makes them a full time job, though? Repair and maintain them (hardware and software)? If so, that seems a rather expensive use of your time when you consider that you should be able to buy at least 100 small PLCs for the cost of your annual salary alone.

**********************
Michael Griffin
London, Ont. Canada
[email protected]
**********************
 

Michel A. Levesque, ing.

This thread is getting more and more interesting. But maybe we should all reflect back on how OPC came to be:

Everyone remembers that during the DOS and Win3.x era, MMI manufacturers had to have their own set of drivers for each and every automation product that they wanted to talk to. InTouch led the way with 300+ drivers and won the lion's share of the MMI market.

Some PLC manufacturers, meanwhile, provided their own drivers to talk to their own equipment via DDE (in Windows only). DDE was not really meant for data acquisition; it was too slow and too unreliable for hard-core data acquisition.

So along comes OPC, which promised to do away with the dreaded driver wars and standardize the way MMIs talked to PLCs and other automation equipment.

From comments on this list, some people want to use OPC to integrate MMI peer-to-peer communications. It seems to me that this is not what OPC was intended for.

Why are we trying to fit a size twelve foot into a size eight shoe? IMHO, we should use OPC to get the field data into the MMI packages. Then we can use anything else that fits better to get the MMI data out to other MMI packages, or SCADA, MES, ERP, etc.

SOAP, XML and the like are for computer-program-to-computer-program communications. I hope nobody is seriously going to use these to acquire data from field devices. If so, then we are going to see a lot of people with shot-off feet.

So what is left to use to get field data into a computer program that runs on Windows? From where I stand: OPC. (Remember, at present we are all locked into Windows because all big-name MMI's run on this platform. We are even seeing most DCS vendors jumping onto the Windows wagon.)

Michel A. Levesque eng., mcp
Directeur Bureau Montreal
AIA Inc.
[email protected]
 

Ralph Mackiewicz

> And that is how this thread started, lest we forget along
> the way! Somebody bemoaning the fact that there were so many
> standards out there, and straight away people said 'now OPC is
> becoming popular as a standard'. So what is it? A class wrapper or
> a protocol?

At the risk of being redundant:

OPC is an API!!!!!!

OPC is not a protocol!!!!!!

OPC can be used to build a wrapper. Cimplicity, InTouch, Fix, etc. all have OPC wrappers (called OPC clients) that allow them to attach to other OPC servers (a wrapper for an IA protocol). But the OPC specification itself is an Application Programming Interface (API) specification.

You should ignore any OPC evangelist who tries to tell you that OPC is a protocol and can be used as a replacement for any of the myriad of IA protocols that are out there. They are, quite simply, wrong. An API is not a replacement for a protocol. If you are communicating only between Windows nodes then DCOM might be a suitable protocol for that application (maybe). But the protocol (DCOM) issue is independent of the API (OPC) issue.
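To make the distinction concrete, here is a toy sketch in Python. This is pseudo-OPC, not the real COM interfaces, and the names are invented: the API is the set of calls the client sees, the protocol is how the bytes move, and one API can sit on top of many protocols:

    class DataServer:
        """The API: what the HMI is written against (the role OPC plays)."""
        def read(self, item: str) -> float:
            raise NotImplementedError

    class DcomBackedServer(DataServer):
        def read(self, item):
            ...  # marshal the call over DCOM -- one possible protocol

    class ModbusBackedServer(DataServer):
        def read(self, item):
            ...  # issue a Modbus register read -- a different protocol entirely

    # An HMI written against the API neither knows nor cares which one it got.
    def poll(server: DataServer) -> float:
        return server.read("FIC-101.PV")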

> In reality users want a common protocol to go between the desktop
> and the field device,

I see no evidence that this is what users really want. It is what you want. But, in general, users don't care about the protocols. If they wanted a common protocol, they would buy a common protocol. Most users fill their need for interoperation by standardizing on a vendor, not a protocol. I'm not saying that I think this is optimal; I am saying that this is reality. As reasonably independent standards become available, some users are selecting these standards instead of vendors (e.g. Profibus, FF, DeviceNet, etc.). But no one of these standards is going to solve every possible type of IA application that is out there. The choice then becomes multiple incompatible standards vs. a single vendor who also uses multiple incompatible protocols but usually (not always) offers a product to interconnect them.

I too would like to see a common protocol. I think it would lower the life-cycle cost for automation substantially and thereby bring
numerous benefits to the manufacturing industry. However, a single common protocol must address a wide variety of different kinds of applications. A protocol capable of doing that will, by necessity, be complex. And, it will involve tradeoffs for any given niche application. Right now users and vendors both despise the complexity
and tradeoffs more than they despise the costs of incompatible systems. "Good enough" is the mantra today.

> What people expect of the OPC, and what many people actually think
> they are getting, is a standardised comms protocol.

I think the vast majority of people understand exactly what OPC is: an API that allows their HMIs to plug into IA comm drivers in a way that offers better performance and easier configuration versus DDE. Every OPC customer we have understands this because they have actually bought our product.

Regards,
Ralph Mackiewicz
SISCO, Inc.

6605 19-1/2 Mile Road
Sterling Heights, MI 48314-1408 USA
T: +810-254-0020 F: +810-254-0053
mailto:[email protected] http://www.sisconet.com
 
Randy:

There are compatibility issues with open source. Just because the source is available doesn't mean that interfaces don't change. A case in point is when Linux underwent a shared library change.
Open source just means inexpensive source; it doesn't mean it was well developed or well documented.

Sam
 
> I love this discussion........
>
> As I have said and has been said by others on the list........the
> reason everyone loves UNIX control systems IMHO is that they have
> spent the time to know how to "tweak" it to be totally (?) stable but
> they will not devote the same time and attention to lowly NT.

Or perhaps you just have not read what is being written.........

I spend more time working with windows than UNIX, and as I have pointed out I err on the side of windows (in the many cases where either would do) because it is 'acceptable' to lose time on windows, whereas if you have problems with UNIX everybody says you should have used windows.

Little wonder, then, that when I do deploy UNIX it is an ideal application for this platform, and I lose zero time with them. I mean zero. I mean the boxes are installed by electricians who plug them in and off they go. I do need to set up a first case, which I can then replicate ad infinitum just by copying the disk image and altering the IP/hostname.
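The 'altering the IP/hostname' step is a few lines of script. A sketch in Python, assuming the copied image is mounted at /mnt/image; every name and address here is invented:

    ROOT = "/mnt/image"
    HOSTNAME, IP = "line4-node2", "192.168.10.42"   # this box's identity

    # Stamp the new identity into the cloned image's config files.
    with open(ROOT + "/etc/hostname", "w") as f:
        f.write(HOSTNAME + "\n")

    hosts = open(ROOT + "/etc/hosts").read()
    with open(ROOT + "/etc/hosts", "w") as f:
        f.write(hosts.replace("template-host", HOSTNAME)
                     .replace("192.168.10.1", IP))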

Of course it could well be the case that if I used UNIX where I use windows I may have more trouble than I do with windows, but windows does lose me an awful lot of time. I do not claim to be expert on NT, but I am not too proud to seek help. I must admit I do get nervous about spending time on learning MS stuff because it keeps changing. Years ago I did put a lot of time into learning OS/2, and look where that got me.........

But that is by the by. In the areas where I do employ UNIX, there is NO MS equivalent. They have been making promises for years about an OS that may be suitable for headless embedded control/networking tasks, and they have never come up with the goods. Now the commodity PC market dictates that PC hardware must be so powerful that it requires multiple fans to keep things cool. At the same time people are turning out RISC processors that offer Pentium performance on so little power that you could feed them from a linear regulator: small size, no fans, ideal for mounting on a DIN rail. NT was originally offered on a wide range of platforms; that has steadily been reduced to just one. Will the last person to promote microkernels please remember to shut down the system log.............

Oh yes, I went through the WinCE CDROM with a fine-tooth comb when they first launched that, and of course that just keeps changing; we are now at the third reincarnation, but I am not making PDA's, thanks all the same. Perhaps I should go and study DCOM before it disappears.

> I have systems running 24/7 that have never been rebooted.

Nobody doubts this can be achieved, but why crow about it? People expected this of UNIX long before NT came out.

Most 'new' NT features are things that have been done on UNIX, and although there are still things UNIX boxes can do that NT cannot, most (myself included) think that technology has moved along such that NT can suit most IA requirements (although X Windows capability would be nice).

BUT, if UNIX can do it, why deploy NT?

I think the windows desktop is very slick, and everybody knows it, but my applications are not particularly graphical; in fact the interface is often remote or non-existent.

I was enthusiastic for OS/2 and then NT because they were to offer UNIX-like power at NT cost. It still costs a lot to have a 1M+ user sparcstation, but it now costs less to install UNIX on an NT-sized machine. So then we talk about 'total cost of deployment'. Well, like I said, I expect my apps to run unsupervised, but unlike Dave I am not perfect, and in any case, sometimes the customer wants changes. Well then, I can enter a UNIX box from anywhere and do anything remotely with no additional software packages. Software development? Well, my IDE/debugger environment under UNIX is pretty much the same as the one under windows. My code is written in ANSI C++ or Python. It works on both platforms, except the serial port stuff, but that is also different with each version of windows. Oh yes, dev tools used to be a major cost issue with UNIX; now that situation has also been reversed.
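As an aside, the serial port difference can be quarantined behind one tiny function, something like this sketch (line settings omitted; the port names are just examples):

    import sys

    def open_port(name):
        # The one non-portable corner: opening the device. Keep it in one place.
        if sys.platform == "win32":
            return open(r"\\.\%s" % name, "r+b", buffering=0)   # e.g. "COM1"
        import os
        fd = os.open(name, os.O_RDWR | os.O_NOCTTY)             # e.g. "/dev/ttyS0"
        return os.fdopen(fd, "r+b", buffering=0)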

'Everybody knows windows' is another argument, but rubbish. Very few people know how to set up an NT box properly, even IT people (as Dave himself points out). So what happens is that people who think they know what they are doing (because it is windows and therefore the same as their home computer) go and alter it, disastrously.

OK, I am not trying to promote the use of UNIX; like I said, I use windows more than UNIX, but the reasons are customer misconceptions, and that is no skin off my nose. But there are cases where NT (or any other MS OS) simply does not cut it. Then I use UNIX, and I find it better, and I remain convinced that much of the work I do under windows would, from a technical and economic point of view, be better off under UNIX. I use Windows for reasons of marketing and mindshare. I am not protesting, just stating.

> I was totally against control via NT but after learning it as well as
> I knew UNIX, I now have little to no problems.

I was all for MS years ago, but over the years they have left me speechless.

> It made me realize that different does not mean "bad" it only means
> different. In today's world market you better be able to accept
> change and other opinions or you are doomed to fail............
>
> Adapt or dye..........and oh by the way, the Internet is not going
> away and why if I had to sum it up isn't it going away?

Keep your mind open or die. Accepting that windows can be usefully deployed, and seriously attempting to deploy it, is correct. But using NT for the sake of it is also stupid. One has to keep an open mind.

Remember that NT, and MS as a whole, is going ever more after the client/server architectures that are ideal for corporate computing and the internet-connected society, yet is not adept at IA and SCADA.

BTW, do you remember when W95 was launched? You may remember that Internet Explorer was not there. We had a button that launched a wizard which was to connect us to the Microsoft Network. In fact MS had this plan of building their own 'internet'. They did not succeed because by the time W95 got onto people's desktops the real internet boom had already started, whereas their network was hardly off the ground; people wanted to connect to the 'real' internet.

But had they been a bit earlier they may well have succeeded. What would you think about having an 'internet' controlled by MS? Some think it would have been better, some worse. What is your opinion?

> I already have systems running that just gather info and spit it out
> to web pages for control and diagnostic information........get used to
> it. I had spoken at the ISA show numerous times 5 years and more
> ago about this "revolution" and people laughed at the time but look
> around.............

Ummm, I was actually doing this 5 years ago. I took the source code of the free NCSA server from my linux box and compiled it on an AIX workstation. The dynamic pages were generated by a Korn shell CGI script. I started after lunch and was demonstrating it to my colleagues before we went for coffee break. But I also realised that while this is very cool, it has limited practical application in supervision.
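The same trick in Python is only a few lines. This sketch invents machine_state() as a stand-in for whatever really gathers the data:

    #!/usr/bin/env python3
    # Minimal CGI: the web server runs this, the browser shows live values.
    import time

    def machine_state():
        # Stand-in for the real data source (PLC poll, log file, etc.).
        return {"line_speed": 412.7, "reel_diameter": 1.88}

    print("Content-Type: text/plain")
    print()
    print(time.ctime())
    for name, value in machine_state().items():
        print(name, "=", value)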

But note also that I can serve dynamic web pages from a little 4M flash based card running Linux, or even Datalight DOS, on a box that cannot even run NT; hence I cannot understand the relevance of your comment.

It does suggest to me that you perhaps do not know UNIX so well, as TCP/IP related services have been around for a long time on UNIX and
have always been very easy to deploy.

> Computers were designed to make things easier, they are just a
> tool, like a hammer. If you give a hammer to me and you give one
> to my friend who is a Master Carpenter, I will build a stick house
> and he will build a work of art. You must know how to use your
> tools.......

And you must know how to pick the right tool for the right job. The master carpenter probably could get the wood to length by knocking bits off it with the hammer, but more likely he will select a saw from a whole range of saws, to suit different types of cut on different types of wood. You, on the other hand, go down to the DIY store, see a shelf full of the ACME super saw that was advertised on TV and that your neighbor has, and pick that, because it is what everybody is using.


> I better get off my soap box.........

No, stay on it. The IA industry is in a period when it must choose OSs and protocols that will have long-term implications, and most people have limited experience. Although there are few participants in this thread, there are many readers. The more people air their opinions the better, as it allows a more balanced view to be obtained.

DISCLAIMER: People pay me to fix Windows-generated problems and limitations; therefore I consider Microsoft to be a business partner.
 

Blunier, Mark

> There are compatibility issues with open source. Just because the
> source is available doesn't mean that interfaces don't change.

There are compatibility issues for any software that changes. It doesn't matter if it is open source, closed source, proprietary, free, etc.

> A case in point is when Linux underwent
> a shared library change.

This is a pretty vague example. What kind of shared library change are you talking about? When linux libraries are changed, it is to add new APIs, or to fix broken APIs that don't work as documented. If you are talking about the switch from libc5 to libc6, you are also spreading misinformation. You can still use libc5 and libc6 at the same time.

> Open source just means inexpensive source it doesn't mean it was
> well developed or well documented.

This is wrong. Open source does not mean inexpensive source at all. Open source means you have the source. The source may be free, or it may come with a purchase price. You may have a license to make changes, or you may not.

Open source may not be well developed, but at least you have the opportunity to look at the code and make that decision before you run it, instead of needing to run it to find out if it's bad.

Closed source programs are not always well documented either, and since you don't have the source it's much more difficult to figure it out on your own, or to find someone who can.

But getting back to the message that you responded to, to bring things back into context: open source doesn't make new hardware work automatically, but if you have the source (and a license to change it), it is still possible for you to get your system to run again even if the vendor won't, or won't do it for a reasonable price.
 
> As I have said and has been said by others on the list........the
> reason everyone loves UNIX control systems IMHO is that they have
> spent the time to know how to "tweak" it to be totally (?) stable
> but they will not devote the same time and attention to lowly NT.
>
> I have systems running 24/7 that have never been rebooted. I also
> spent a large amount of time setting things up front. The bad thing
> is I am basically a full time IT person now because I have like 25
> systems and 150 PC's out there and the IT people are not used to
> 24/7 response times. By this I mean, we still have hardware crashes
> because of a management decision (again IMHO) to not buy "hardened"
> hardware, but those are rare.

Yeah, OK, it's _always_ the hardware. C'mon, even Ballmer admits there was room for improvement. I suppose now when you go to W2K it won't even go down if you shut it off. I'll pause a moment for the list members to reflect on their own experiences. That, by the way, is why we converted to Linux; we don't want to be booters and reloaders.

> I was totally against control via NT but after learning it as well
> as I knew UNIX, I now have little to no problems.
>
> I control the machines via user profiles so that someone cannot play
> solitaire etc. and the interface is locked down to just running my
> control software, and we actually "tested" things and tweaked them
> just like I had to do the first time with UNIX and we devoted the
> time to learn the stuff. I am now an MCSE only because I like that
> kind of cheap personal gratification to see how I am doing (only for
> myself). We also ghosted the machines to network drives so that any
> hardware issues and we can restore the machines in like 6 minutes.

But, you never have to do that.

> It made me realize that different does not mean "bad" it only means
> different. In todays world market you better be able to accept
> change and other opinions or you are doomed to fail............

OK, I'll take my chances.

> Adapt or dye..(die)........and oh by the way, the Internet is not
> going away and why if I had to sum it up isn't it going away?

Because the Internet runs on UNIX and existed before the first MS machine ever connected. And if we can keep it from being perverted with "extended" protocols and vendor-specific websites, it has a very bright future. I expect _some_ company to try to take it over, but so far all they have managed to do is break my netscape and produce broken Java and HTML. Inconvenient, but the Internet is still free.

> Because I do not care that I have connected to a Unix, NT, Linux,
> Mainframe etc system to get my information and that it took 25 hops
> through routers all over the world..........I only care that I get
> my INFORMATION..........that is the bottom line. Before it is done,
> like it or not........the internet browser will be the control
> system of use.

We agree completely here.

> I already have systems running that just gather info and spit it out
> to web pages for control and diagnostic information........get used
> to it. I had spoken at the ISA show numerous times 5 years and more
> ago about this "revolution" and people laughed at the time but look
> around.............

This is simply a given with *nix systems, nothing new. In fact, I'm not sure where the Internet rant is coming from; I have been a zealous advocate of the Internet and its free and open protocols. I have even used it as an example of what can be accomplished by cooperation, even with competitors.

> Computers were designed to make things easier, they are just a
> tool, like a hammer. If you give a hammer to me and you give one to
> my friend who is a Master Carpenter, I will build a stick house and
> he will build a work of art. You must know how to use your
> tools.......

I agree here too.....I have a lot more tools and more freedom to use them.

Regards

cww
 

Anthony Kerstens

I find it interesting that the most verbose thread is about the platform with the most verbose languages and complicated setups.

Hail the PLC.
Hail DOS too.

Anthony Kerstens P.Eng.
 
Probably because his machines don't have the resources. That being said, there are many flavors of embedded Linux that will run on very low-end machines and provide all the capabilities one would need.
 
> When it gets too
> hosed up, insert CD and boot.

Sometimes, I set the root partition up read-only, and have just a small rw partition that is initialised with a clean copy of /var at boot and contains the home directory, which holds any app configs in simple ASCII files they can print out if they want to. Or I use Python scripts; most people can figure out how to do simple mods to those, such as changing timeout values etc.
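The kind of script I mean is nothing fancy: the settings sit at the top in plain sight, so changing a timeout is a one-line edit. A sketch with invented names:

    # --- settings a technician can safely edit ---------------------------
    POLL_TIMEOUT = 5.0   # seconds to wait for the PLC before alarming
    RETRIES = 3          # reconnect attempts before giving up
    # ----------------------------------------------------------------------

    def poll_plc(read_fn):
        # read_fn is whatever does the real I/O; try it a few times.
        for attempt in range(RETRIES):
            value = read_fn(timeout=POLL_TIMEOUT)
            if value is not None:
                return value
        raise RuntimeError("PLC not answering after %d tries" % RETRIES)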
 

Dave Ferguson

Michael Griffin responded and here is my reply..........

By full time I mean that we have roughly 100 PLCs (50 AB, roughly 20 Siemens, 20 GE, and 10 oddballs). We have like 25 HMI systems as well as a large DCS and another major system (ABB). These systems are also tied to, give or take on any given day, 150 PCs. We also have links from all of these systems to an upper level "shop to top" system, as well as HMI maintenance diagnostics systems. We also administer a 100 Mb Ethernet network of VLANs and managed switches as well as routers. We also manage an internal Intranet and 5 servers.

By "full time IT person" I mean that new users, security, HMI and DCS revisions, PLC automated backup system, server tape backups, user "help desk" issues, network management (software and hardware) etc.

Like most management people I work with and for, you seem to think that because it is "automated" it just sets itself up and runs itself. This is part of the snubbing of NT. I am not a Linux expert. Just like I became an AB "expert", I had to become an NT "expert". I use the term EXPERT loosely because I don't think there is such a thing except in people's minds.

What I was trying to get at is that now I have all of the issues that the "business system" IT people had 10 and 20 years ago: change management, engineering, backup and recovery, user management, security, software revisions and testing, etc.

To add a loop in the field or change a calibration requires a huge outlay of personnel time that management needs to realize.

For instance, to change the range of a level loop requires the actual recalibration, documentation, a DCS database revision, graphics revisions, links to the upper-level range change, a database change in the upper-level system, graphics changes in the upper-level system, and documentation, drafting, data sheets, etc. THIS DOES NOT HAPPEN AUTOMATICALLY.

Managers had better wake up and realize that this gets done with my salary, not 100 small PLCs.

My entire point was that NT works if you know what you are doing, just like AB works if you know what you are doing, or Fisher-Rosemount works if you know what you are doing, or UNIX works if you know what you are doing. Usually not liking something comes from not taking the time to learn it. I usually like the first thing I learned and must remind myself to be open to CHANGE. This problem IMHO is exacerbated by the fact that there are no manuals for anything anymore, only electronic "help" files. The problem is that I can only look up a "feature" if I already know what I am looking for. I like BOOKS......but that is another discussion.

In my plant I can shut the entire plant down by turning off the wrong switchgear, or I can shut it down by pulling the wrong air line or by screwing up just the right control system. My point is you had better know what you are doing or don't do it. NT works if you devote the time to learn it.

Dave Ferguson
Blandin Paper Company
UPM-Kymmene
DAVCO Automation
 
There is strong evidence that it is made better by peer review, but I agree that there are no guarantees. At least you can look at it and find out. And you have a much better chance of dealing with issues if you have the source. Most of the libc5/glibc issues required only a recompile. Of course, you need source for that. MS users still face DLL hell every day, and every application installed is a time bomb. I'll take my issues versus their issues any time.

Regards

cww
 
> There are compatibility issues with open source. Just because the
> source is available doesn't mean that interfaces don't change. A
> case in point is when Linux underwent a shared library change.

Yes, just look. You see, you can choose whether to make a package a.out or ELF (which was the big change in the shared libraries). Major releases of Linux switched to ELF 5 years ago, but you can still compile and support a.out, and you can make a completely a.out system.

The same is true of another recent major change, from libc5 to glibc.

Of course this is looking at the problem from a closed source point of view, where the problem is supporting old software in new environments. Open source offers a much more desirable solution: recompile the software to make a new version, or to make it run on different hardware, or just to make it run better (for example, compiling with Pentium-specific optimisations enabled).

Although OSS is generally available in the form of pre-built binaries, the main distribution is considered to be the source, and this will include makefiles and utilities which will automatically configure the options to best suit your platform and then build your optimised package.

> Open source just means inexpensive source it doesn't mean it was
> well developed or well documented.

Open is not necessarily inexpensive; it can cost more. As for development or documentation, that varies from software to software irrespective of open source issues.

Mind you, there is an implication here that OSS is perhaps not as well documented, or perhaps not as well written.

I am not going to argue, just point you to the facts. For example, the Linux Documentation Project, which covers everything from individual command options through to how the kernel works, may be viewed online at

http://www.ldp.org

As for the coding, well, you can see for yourself, go look at it;-)

But how did this get into the protocols thread? Presumably because many people confuse open standards with open software. It is like confusing opening a door with open heart surgery: open means the same, but it is meaningless without a context;-)

For example, the telecoms industry is rigorously conservative, and you can expect to pay thousands of dollars just to get documentation.

But their standards are completely open and interoperable and do not build on vendor specific technology. And it works.
 
> Yeah, OK, It's _always_ the hardware.

Just a note on 'hardware' and 'drivers'.
Device drivers actually represent the bulk of OS core code. MS mostly relies on hardware vendors to develop this code, and blames them when things break.

OTOH, even companies like 3COM and HP struggle to
produce device drivers for the latest and greatest versions of windows. I do not think they are short of skilled engineers, and I am sure they get full support from MS, yet struggle they do. Therefore there would seem to be a design flaw in the requirements for the device drivers.
 
Hi Roger

Just so we don't confuse folks: that was sarcasm. I have my lab populated with "bad" hardware that didn't work with MS. Red Hat 6.2 repairs it all. You guys can draw your own conclusions. PC hardware is a lot more reliable than it gets credit for, because MS blames everything on the hardware. With no wild pointers scribbling on the disk, you don't have mystery crashes.

Regards

cww
 