
(?) Some juicy rants from The Answer Guy

Snippings Provided By The Wizard's Hat

From Billy a.k.a. CustomerMarket

(?)Hi All

I am trying to configure my two computers, with Linux and Windows 2000, into a network. I am using a DSL modem and router. I would really appreciate it if somebody could spare a few ideas, because I am on the verge of breaking my head. (Not literally, though.)

Thank you all
Billy

(!) [Wizard Hat] Okay. You install and configure Linux and connect it to your network. Then you install MS Windows 2000 on the other computer and connect it to your network.
I'm going to make a wild-ass guess that your DSL modem/router is doing IP masquerading (a particular form of NAT, network address translation) and that it probably offers DHCP services on its "inner" (or LAN, local area network) interface --- leasing out a set of RFC1918 "reserved" addresses (192.168.*.*, 10.*.*.*, or 172.16.*.* through 172.31.*.*). So, you can probably configure both computers to just get their networking information from the router dynamically (automatically).
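If you're curious whether the address your machine actually got is one of those RFC1918 "reserved" addresses, here's a minimal sketch in Python that checks an address against the three private ranges (the address in the example is just a placeholder; substitute whatever lease your interface actually received):

    # Check whether an address falls in one of the RFC1918 private ranges.
    import ipaddress

    RFC1918_NETS = [
        ipaddress.ip_network("10.0.0.0/8"),
        ipaddress.ip_network("172.16.0.0/12"),
        ipaddress.ip_network("192.168.0.0/16"),
    ]

    addr = ipaddress.ip_address("192.168.1.23")   # placeholder: use your actual lease
    if any(addr in net for net in RFC1918_NETS):
        print(addr, "is an RFC1918 'reserved' address (typical behind a NAT router)")
    else:
        print(addr, "is publicly routable")

Seeing an RFC1918 address on both machines is a good sign that the router's DHCP and masquerading are doing what I guessed above.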
The exact details of configuring your router, and W2K for this are beyond our purview. Talk to your ISP or refer to the router's documentation for the former. Call Microsoft or find a Microsoft-centric support forum for the latter.
The precise details of configuring Linux to use DHCP depend on which distribution you use. In general, the installation programs for mainstream distributions will offer this option in some sort of dialog box or at some sort of prompt. That's the easiest way of doing it (easiest meaning: "requiring the least explanation in this e-mail"). You haven't said what distribution you're running, so I couldn't offer more specific suggestions without having to write a book.
This all seems pretty obvious. I suspect that you have some other needs in mind. However, we haven't installed the telepathy protocol daemons in our little brains yet. So we can't hazard a guess as to what you mean by 'configure.'
I might guess that you want to do file sharing between the two: read a book on Samba to let Linux export/share some of its disk space (filesystems and directories) to the MS Win2K system, and perhaps look for a chapter or so on smbfs, which lets Linux "mount" (access) shares from the W2K system (i.e. to go the other way).
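As a rough sketch of what talking SMB looks like once things are set up, the following Python snippet just shells out to the smbclient tool (part of the Samba suite) to ask the Windows 2000 box what shares it offers; the host and user names are placeholders, and this assumes smbclient is installed on the Linux side:

    # Ask a Windows (or Samba) machine which shares it exports, via smbclient.
    import subprocess

    host = "w2kbox"     # the Windows 2000 machine's name or IP (placeholder)
    user = "billy"      # an account valid on that machine (placeholder)

    # 'smbclient -L host -U user' lists the shares; it will prompt for a password.
    subprocess.run(["smbclient", "-L", host, "-U", user], check=True)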
I might guess that you want to access your Linux system, particularly its command line interface, from your Windows desktop system. In that case download and install PuTTY (the best free ssh client for MS Windows --- I would say the best ssh client all around). That will allow you to "ssh" into your Linux system (open command prompt windows to administer it and run programs from there). You might even want to remotely access graphical Linux programs from the Windows box (or vice versa). In that case you'd probably want to look into VNC (virtual network computing --- actually a rather silly name). VNC clients and servers run under Linux (and other forms of UNIX) and MS Windows, and there is a Java client that can even run from a web browser.
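The same sort of remote command access that PuTTY gives you interactively can also be scripted; here's a minimal sketch using Python and the third-party paramiko ssh library (the host name, user name and password are placeholders, and you'd need paramiko installed wherever this runs):

    # Run a single command on the Linux box over ssh and print its output.
    import paramiko

    client = paramiko.SSHClient()
    client.set_missing_host_key_policy(paramiko.AutoAddPolicy())      # acceptable on a small home LAN
    client.connect("linuxbox", username="billy", password="secret")   # placeholders

    stdin, stdout, stderr = client.exec_command("uptime")
    print(stdout.read().decode())
    client.close()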
There are numerous other ways to do each of these, BTW. You could install NFS clients on the Windows side for file sharing (those were all commercial last I heard). You could use the MS Windows telnet client and install and configure the deprecated (as in "insecure, use at your own peril") telnet service (daemon) on the Linux side for character mode (terminal and command line) access. And you could get X servers for MS Windows --- most are commercial --- and/or you could run rdesktop for Linux to access the MS Windows "Terminal Server" features (however, Terminal Services are an expensive add-on for Windows, as far as I know). In other words, Samba/smbfs, PuTTY/ssh, and VNC represent a set of services that provide file, command, and remote graphical support between the two systems using only free, well known software at both ends.
I might provide more details on how these packages could be used. However, each of these is just a shot in the dark at what you might be looking for, so I've spent enough time on the question.
Here are a few URLs you can use to read more about these packages:
Please note: anything I say about MS Windows is likely to be wrong. I haven't used MS Windows regularly for almost 10 years. At the last couple of places where I worked or contracted that put MS Windows systems on my desk (to access Exchange for their e-mail and groupware/scheduler functions) I found that I barely used them --- e-mail, browser, and PuTTY were as much as I ever used on any of them. I'm almost exclusively a UNIX/Linux administrator and programmer, so I've thoroughly lost touch with the whole Microsoft-based universe.

This was posted in the open forums attached to "Langa Letter" -- one of the InformationWeek regular columns. The Answer Guy's actual reply is what's sitting here in my clippings-box; the column to which he is replying was:
   Fred Langa / Langa Letter: Linux Has Bugs: Get Over It / January 23, 2003

Fred's comment about "severity" is, as he points out, inherently subjective. His numerical analysis is also subject to several further issues that he simply ignores.

For example, the 157+ bug count for RH 7.2 or 7.3 includes fixes for many overlapping packages, many of which are rarely installed by Linux users -- RH simply includes a lot of optional stuff. Meanwhile the count for Microsoft may still be artificially low, since MS is known to deliberately minimize the number and severity of their bug reports. Many of their 30+ reported patches might include multiple fixes, with descriptions which downplay their significance.

Fred also, inexcusably, argues that "first availability" of a fix (in source form, sometimes in focused, though public, mailing lists and venues) "doesn't count" as faster. That is simply jury-rigging the semantics to support a prejudiced hypothesis.

Another approach to looking at the severity of bugs is to view the effect of exploits on the 'net as a whole.

In the history of Linux there have only been a handful of widespread worms (episodes where a bug's exploit was automated in a self-propagating fashion). Ramen, Lion and Adore are the three which come to mind.

Subjectively, the impact of these was minimal. The aggregate traffic generated by them was imperceptible on the global Internet scale. Note that the number of Linux web, DNS and mail servers had already surpassed MS Windows servers by this time --- so the comparison is not numerically outrageous.

Compare these to Code Red, Nimda, and the most recent MS SQL Server worm. The number of hosts compromised, and the effect on the global Internet, have been significant.

I simply don't have the raw data available to make any quantitative assertions about this. However, the qualitative evidence is obvious and irrefutable. The bugs in MS systems seem to be more severe than comparable bugs on Linux systems.

If a researcher were really interested in a rigorous comparison, he or she could gather statistics from various perspectives --- concurrently trying to support and to refute this hypothesis.

Fred is right, of course, that Linux has many bugs --- far too many. However, he then extends this argument too far. He takes some fairly shoddy anecdotal numbers, performs trivial arithmetic on them, and tries to pass this off as analysis, concluding that there is no difference between the security of MS XP (and their other OSes) and that of Linux (Red Hat).

I won't pass my comments off as anything but anecdotal. I won't look up some "Google" numbers to assign to them and try to pass them off as statistical analysis.

I will assert that Linux is different: that bugs in core Linux system components are fewer, less severe, and fixed faster, and that their fixes are (for the skilled professional) easier and more robust to apply across an enterprise than those for security issues in Microsoft-based systems.

The fact that numerous differences between these two OSes make a statistical comparison non-trivial doesn't justify the claim that there is no difference.

Further anecdotal observations show that the various Linux distributions and open source programming teams have done more than simply patch bugs as they were found. Many of the CERT advisories in Linux and elsewhere (on the LWN pages, for example: http://www.lwn.net/ ) are the result of proactive code auditing by Conectiva, Gentoo, S.u.S.E., IBM and the MetaL group at Stanford, among many others. In addition, many of these projects are significantly restructuring their code, and whole subsystems, in order to eliminate whole classes of bugs and to minimize the impact of many others.

For instance, the classic problems of BIND (named, the DNS server) running as root and having access to the server's whole filesystem used to be mitigated by gurus by patching and reconfiguring it to run "chroot" (locked into a subdirectory tree) and with root privileges dropped after the initial TCP/port binding (before interacting with foreign data). These mitigations are now part of the default design and installation of BIND 9.x. Linux and other UNIX installations used to enable a large number of services (including rsh/rlogin and telnet) by default. These services are now deprecated, and mainstream distributions disable most or all network services by default and present dire warnings in their various enabling dialog boxes and UIs before allowing users to enable them.
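The "bind the privileged port as root, then drop root" pattern that BIND 9 now applies by default is easy to illustrate. Here's a minimal sketch in Python (it must be started as root; the 'nobody' account and UDP port 53 are just illustrative, and a real daemon would also chroot itself before touching any foreign data):

    # Minimal sketch of the "bind a low port as root, then drop privileges" pattern.
    import os
    import pwd
    import socket

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("0.0.0.0", 53))        # binding a port below 1024 requires root

    # A chrooted daemon would also lock itself into a subdirectory tree here:
    # os.chroot("/var/named"); os.chdir("/")

    nobody = pwd.getpwnam("nobody")   # an unprivileged account to fall back to
    os.setgid(nobody.pw_gid)          # drop the group first, while we still may
    os.setuid(nobody.pw_uid)          # irrevocably give up root

    # Anything triggered by hostile packets from here on runs as 'nobody'.
    data, peer = sock.recvfrom(512)
    print("received", len(data), "bytes from", peer)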

These changes are not a panacea. However, they are significant in that they hold out the promise of reducing the number and severity of future bugs, and they artificially inflate recent bug-count statistics (since the majority of this work has been done over the last two or three years).

Fred will undoubtedly dismiss these comments as being more "rabid advocation" by a self-admitted Linux enthusiast. He may even point to MS' own widely touted "trustworthy computing" PR campaign as evidence of a parallel effort on "the other side of the Gates." However this message isn't really written to him.

It's written to those who want to make things better.

The real difference between security in MS and in Linux is qualitative rather than quantitative. With Linux every user and administrator is empowered to help themselves. Every one of us can, and many more of us should, accept a greater responsibility for our systems and their integrity and security. Linux users (including corporations, governments and other organizations) can find and fix bugs and can participate in a global community effort to eliminate them and improve these systems for everyone.

Let's not get wrapped up in blind enthusiasm and open source patriotism. But let us not fall prey to the claim that there is no difference. There is a difference, and each one of us can be a part of making that difference.



Copyright © 2003
Copying license http://www.linuxgazette.net/copying.html
Published in Issue 89 of Linux Gazette, April 2003
HTML script maintained by Heather Stern of Starshine Technical Services, http://www.starshine.org/

