ONE HOUR Linux SESSIONS: 2014-2018
sessions in chronological order going back to 2014

sessions continue at:
http://johnmeister.com/linux/Intro-to-Linux/One-Hour-Linux-Sessions-2019.html


2018




Remember: SAW - simple always works
String too many special characters together and most of us Linux lovers won't remember or understand them.
The goal here is to understand and remember.
The intended audience of this material is power users, Sys Admins, Analysts, Engineers, and Web Masters.
Programmers may pick up a few tricks, but anything they share will likely far exceed our limited comprehension buffers.

NOTE: LinuxMeister dot net has expired; to access files use this path: http://johnmeister.com/linux/




The Art of Linux System Administration - on Amazon and O'Reilly Media



Session #140 - November 30, 2018 -

Session #140 Linux at Lunch - 11/30/2018 - LAST SESSION
SESSION #1 was held on November 26, 2014.
This will be the last session, at least for 2018...
    ending this series at #140 on Nov 30, 2018 - 4 years and 4 days...
 
THIS SESSION:
- using Cygwin on Windows for file management and regular expressions
    http://johnmeister.com/linux/Scripts/Engineering/Get-a-CLUE-w-Linux.pdf
- using time to see how well Cygwin performs (compare with Linux if possible)  (time)
- fix-filenames:  a script to undo bad habits
    http://johnmeister.com/linux/Scripts/fix-filenames.sh.html
- determine disk hogs  (du -sh) and other file management tricks of sys admins (that create more problems later! :)
- find large files and move, remove, etc.  (find ....)  (page with examples is AWOL... !!!)
    http://johnmeister.com/linux/Commands/find-copy.html
- using script to capture tasks you'd like to make into a script  (script name.RAW; cat name.RAW | col -b > name.txt )
- using Windows commands in Cygwin
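The fix-filenames idea can be sketched in a few lines. This is a hypothetical stand-in for the script at the link above; the character set and renaming scheme are my assumptions, not the original fix-filenames.sh:

```shell
# hypothetical sketch of a fix-filenames pass: replace spaces and parens
# in file names with underscores (not the actual fix-filenames.sh)
for f in *' '*; do
    [ -e "$f" ] || continue                 # no match: the glob stayed literal
    mv -n -- "$f" "$(printf '%s' "$f" | tr ' ()' '___')"
done
```

Run it in a copy of the directory first; mv -n refuses to clobber an existing file if two names collapse to the same target.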

Session #139 - November 9, 2018 -

Session #139 - November 9 - Linux at Lunch
THIS SESSION: crontab, wrapper scripts and backup techniques

- wrapper script to create a copy of a file with a sortable date prefixed to file name 
    http://johnmeister.com/linux/Scripts/using-DATE-prefix-or-suffix.html 
    http://johnmeister.com/linux/Scripts/wrapper-script.html 
    http://johnmeister.com/linux/Scripts/wrapper-copy-w-date.sh.html 
    http://johnmeister.com/linux/Scripts/filedate.sh.html 
- identify a list of key files: e.g. .bashrc, .vimrc, .viminfo, .ssh/authorized_keys, /etc/hosts, /etc/passwd, etc.
        http://johnmeister.com/linux/Scripts/Replace-README-using-find.html
        http://johnmeister.com/linux/SysAdmin/Editing-Apache2-etc-default-server.conf.html
- create script to check key files and create a dated backup copy of updates in ~/bin/KEY
        http://johnmeister.com/linux/Scripts/LOOPS-shell-examples.html
        http://johnmeister.com/linux/SysAdmin/system-check-script-July-2012.sh.html
- create rsync script to sync directory of backup copies with another system
        http://johnmeister.com/linux/Scripts/rsync-directories.sh.html
        http://johnmeister.com/linux/SysAdmin/Using-rsync.html
        http://johnmeister.com/linux/Notes/RSYNC-cmd.html
        http://johnmeister.com/linux/SysAdmin/SysAdmin-scripts-n-tools.html
- incorporate scripts to check key files and rsync into crontab
        http://johnmeister.com/linux/Notes/crontab-email-setup.html
        http://johnmeister.com/linux/Notes/wget-n-crontab.html
        http://johnmeister.com/linux/SysAdmin/Cron-Rsync-sync-up-drives.sh.html
        http://johnmeister.com/linux/FileSystems/man.rsync.txt
- demonstrate how to use tar and compress (etc.)  to create an additional backup copy to be archived
        http://johnmeister.com/linux/FileSystems/creating-TAR-n-ZIP.html
        http://johnmeister.com/linux/Commands/MAN/man.tar.html
        http://johnmeister.com/linux/Notes/how-to-TAR-home.html
        http://johnmeister.com/linux/SysAdmin/UNIX/OldTapeBackup-ufsdump-restore/
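The backup chain above can be sketched end to end; the paths and the crontab schedule here are illustrative assumptions, not the linked scripts:

```shell
# dated copy of a key file into the backup directory (paths are examples)
mkdir -p ~/bin/KEY
cp ~/.bashrc ~/bin/KEY/"$(date +'%Y-%m-%d')".bashrc

# roll the whole directory into a compressed archive for off-system storage
tar czf ~/KEY-backup-"$(date +'%Y-%m-%d')".tar.gz -C ~/bin KEY

# crontab -e entry (illustrative schedule and script name):
# 30 1 * * * $HOME/bin/rsync-backup.sh
```

The -C ~/bin flag makes tar record the path as KEY/... instead of the full absolute path, so the archive restores cleanly anywhere.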

Session #138 - October 26, 2018 -

Session #138 Linux at Lunch - 10/26/2018 - 
Basics of RAID - (redundant array of inexpensive disks):
 
 http://johnmeister.com/linux/FileSystems/RAID-info.html

the Logical Volume Manager (LVM) on Linux will allow you to use JBOD (just a bunch of disks) for RAID:
 
 http://johnmeister.com/linux/FileSystems/setup-LVM.html 

Using more than one computer at home, with external drives connected via USB, is a reasonable backup method.
Will discuss different scenarios and using rsync:

http://johnmeister.com/linux/Scripts/rsync-directories.sh.html

Session #137 - October 12, 2018 -

Session #137 Linux at Lunch - 10/12/2018 - review environments and scripts, using cygwin for windows commands
FRIDAY'S SESSION:
    - discuss environments (.bashrc)   http://johnmeister.com/linux/basic-setup-for-BASH.html
    - review scripts and discuss:   http://johnmeister.com/linux/Scripts/
    - discuss cmd in cygwin:   http://johnmeister.com/linux/Microsoft/CygWin-when-penguins-are-not-allowed-to-roam-free.html


Session #136 - September 28, 2018 -

Session #136- September 28, FRIDAY: READING a file into another file...
SITUATION RESOLVED: 1,189 web pages have been updated.
Linux at Lunch - Session #136 - 9/28/2018 - completed file insertion using SED and SHELL

FRIDAY'S SESSION:

READ a file into another file... COMPLETED...  
1,189 web pages NAVIGATION added (by Monday night!)...

review and discuss the process and tools used.  
Show how using and editing the history allowed for faster turnaround by adding commands via semicolon on one line.

http://johnmeister.com/linux/Scripts/SED-used-to-insert-Table-of-Content-in-HTML-pages.txt

http://johnmeister.com/BIBLE/STUDY 


Session #135 - September 21, 2018 -

Session #135- September 21, FRIDAY: READING a file into another file...
SITUATION: Have 1,189 web pages that I'd like to add NAVIGATION to.

However, there are 66 groupings of these pages with different navigation. 

So I need to create a custom navigation for each of these 66 groupings - 
have a basic set of internal vi sed commands. 

Once that's created I need to insert that navigation, or more correctly, 
a Table of Contents, into each of those like pages. 

Then scp or rsync my local system with my server and test. 


1) I have a manual process: 
    1) vi *files-with-same-name* 
    2) <esc> / key words 
    3) dd ; <esc>:r toc-filename 
    4) <esc>:wn 
    5) repeat until fingers bleed then scp -rp <updated files> server:/path

2) someone said PYTHON, his name was NOT monty...  so... with his help I got it working:
        1) I created the toc (table of contents) file, then found the key word search string
        2) dummied up files with simple names and tested - worked
        3) attempted to insert a simple for loop for *files-with-same-name*, failed
        4) BRUTE FORCE SCRIPT worked:  http://johnmeister.com/linux/Scripts/Python-script-attempt-without-a-loop.txt
        5) the use of python without knowledge, training or a clear cheat sheet was MORE WORK

3) someone said SED... it wasn't Fred... but it was said by more than one... so...
        1) got some basic stuff working...
        2) can find the pattern and read the toc file... however, it doesn't output the original file to stdout
        3) I could not get sed to edit in place with the pattern as stated
        4) am reading the sed & awk book by O'Reilly and hoping to write a sed script before Friday
        5) one possibility is to replace the complex with PERL, then...
        6) sed '/banana/r file2' file1 > newfile
        7) GOOD NEWS at 23:13:38 19 Sep - PROBLEM: needed to add ".bak" to the -i in macOS BASH / SED!!!
                  Once I escaped all the special characters I still got this error:
                sed: 1: "mal.html": invalid command code m
                SEARCH pointed out the problem was a "safety feature".
                (BSD sed on macOS requires an explicit backup suffix after -i; without one,
                 sed takes the script as the suffix and tries to run the file name as the
                 script, hence "invalid command code m".)
     sed -i.bak '/\<br\>\<center\>\<hr width=\"50%\"\>\<\/center\>\<\/td\>/r toc-mal' mal.html

       8) now I can write a for loop to get through all of the groupings!
       9) thanks to the folks who said to use SED.    :)
      10) one other note... had to add one more line to remove the string that was supposed to be replaced:
     sed -i.bak '/\<br\>\<center\>\<hr width=\"50%\"\>\<\/center\>\<\/td\>/d' mal.html
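Steps 8-10 above can be sketched as one loop. MARKER stands in for the escaped HTML string, and the glob and toc file name are this session's examples, generalized:

```shell
# hypothetical loop over one grouping of pages: read the TOC file in
# after the marker line, then delete the marker itself (two passes,
# matching the two sed -i.bak commands above)
for f in *mal*.html; do
    sed -i.bak '/MARKER/r toc-mal' "$f"
    sed -i.bak '/MARKER/d' "$f"
done
```

The r command writes the toc file after each matching line, and the second pass removes the now-redundant marker; .bak leaves a safety copy beside each page.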

4) PERL still seems like a viable tool...
        1) however, my perl scripting class is long expired (1998)
        2) but... I can use my perl -pi -e 's/complex string for/ banana/g'  files   
            - and then use the sed line from #6 above.
        3) I'm going to have my nose in the O'Reilly book on sed & awk and see what h and H and r really do in sed.
        4) Considering the damage I did with perl on prior sessions, I'm stopping at SED this trip.
        5) although perl is still in my tool box, just fixed an error where I had "Revelation" in the title of Isaiah!

        --> grep Revelation *Isa*     #  oops...  all 66 books had it!  didn't edit my template properly
        --> perl -pi -e 's/Revelation/Isaiah/g' *Isa*
        --> grep Revelation *Isa*     #  all gone... 

ONE THING I'VE LEARNED is that a bunch of my really smart friends will say, oh, just use this or that... most of them just leave you with that, or say RTM... or they could do it with this or that tool... and let you research it. I've heard and experienced this for decades... it's very common for a developer or advanced sys admin to say these things; it's another thing for those helpful and knowledgeable folks to actually help with the exact syntax and make it work... generally it is never as easy as any of us thought. But every little bit of info, advice and suggestions helps.

Have a friend back east who has been helping me with Python... he's run into the same issues I have, but worked it through... he's promised to show me how to set up a for loop tonight (Wed)... I'm trying to make a generic script that does this and just change the variables. I have manually edited those 1,189 files several times... and have ideas for more enhancements. :) Despite my lack of coding skills, the web site is coming along nicely.

Session #134 - September 14, 2018 -

Session #134- September 14, FRIDAY: SIMPLE ALWAYS WORKS!
TWO THINGS TO SORT OUT:
    1) ERROR:   xargs: unmatched single quote; by default quotes are special to xargs unless you use the -0 option 
    2) difference in counts between xargs and -exec when using the same command 
         (...and of course any differences with grep -i and escaping ".") 

Once again, upon searching with this error I find myself in
     the WASTELAND of FORUMS with misinformation and unanswered questions,
leaving me to set up a known situation and test the variables myself.
Will also try to do this in Cygwin so we can also use find and perl on Windows...

HOWEVER, one of the lunch time linux folks suggested finding the rogue file, found it, fixed it and updated server...

------------------------------------------------ 
## using xargs gave an error, and a slightly different count than using -exec
------------------------------------------------ 
--> time find . -type f -name "*.html" | xargs grep -i "johnmeister.com" | wc -l 
xargs: unmatched single quote; by default quotes are special to xargs unless you use the -0 option 
52843 
   real 0m10.360s 
   user 0m5.439s 
   sys 0m0.321s 
------------------------------------------------ 
--> time find . -type f -name "*.html" -exec grep "wagoneers\.com" {} \; | wc -l 
61947 
   real 3m29.839s 
   user 0m53.282s 
   sys 0m25.615s 
------------------------------------------------ 
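The unmatched single quote error means some file name in the stream contains a quote character that xargs tries to interpret; NUL-delimiting the stream (find -print0 with xargs -0) sidesteps it. A sketch, using the same pattern and search string as above:

```shell
# NUL-delimited version of the pipeline above: file names with quotes,
# spaces, or newlines pass through xargs safely
find . -type f -name '*.html' -print0 | xargs -0 grep -i "johnmeister.com" | wc -l
```

This also helps explain a count difference: with plain xargs, input from the offending file name onward may never reach grep at all.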
Here's the steps used to sort out the single quote thing and update the server:

    
    http://johnmeister.com/linux/Scripts/Using-find-perl-to-fix-wagoneers.com.html

---------
UPDATE FROM SESSION:
find tmp -type f -name 'sed-xargs-test.txt' | xargs sed -i -r 's$before$after$g'
    sed with a "-i" = in place editing
    sed with a "-r" = extended regex   (more perl compatible) (still must be careful with parens!)
    NOTE: keep -i and -r separate; GNU sed parses a bundled "-ir" as -i with backup suffix "r".

the find with parens destroyed thousands of files... must escape parens:  \(  \)

NOTE: a backreference example - the \( \) group captures "-processing" and \1 reuses it,
      so e.g. "pre-processing" becomes "post-processing":
sed 's/[a-z]*\(-processing\)/post\1/'


Session #133 - September 7, 2018 -

Session #133- September 7, FRIDAY: SIMPLE ALWAYS WORKS!
1) using grep, find and perl... 
   or how to fix 3,359 web pages in 8 minutes and 40 seconds. 
- 
http://johnmeister.com/linux/Notes/Perl-to-fix-incomplete-URL.html 
- 
http://johnmeister.com/linux/Commands/PERL-examples-2018-09-Sep-06.txt
- will fix some entries on all the pages in real time, e.g.: 
   a) grep for suspect string, pick a file:   
     grep "Cor rinthian" *   # 56 files were found with this error.
     1091_47_2_Corinthians_13_nas.html:<th colspan="3" bgcolor="aaccff">  2 Cor rinthians 13 <!-- DAY and VERSE info here --> </th>
   b) a good idea at this point is to make a COPY of the file in /tmp as a precaution!
     cp 1091_47_2_Corinthians_13_nas.html /tmp/save-for-a-bit   # (history will help you remember the real name)
   c) select one file, as noted above, try perl string, grep again or edit file to validate,
     perl -pi -e 's/Cor rinthians /Corinthians /g' 1091_47_2_Corinthians_13_nas.html
   d) test: grep "Cor rinthian" 1091_47_2_Corinthians_13_nas.html    # expected response is none... nothing... "silence is golden"
   e) now, recall history and change the file name 1091... to a wildcard, or restricted wildcard, e.g. *Cori*
   f) perl -pi -e 's/Cor rinthians /Corinthians /g' *
   g) at this point there are two other directories that have related files, would cd ../OTHERDIR1, recall command via history, then cd again
   h) while not completely necessary, removing the file from /tmp (or wherever you parked it) is a good idea.

2) combine the use of find to fix or change various strings in the text, for example:
   # grep "" *     # two files came up in the subdirectory
   # perl -pi -e 's///g' bashrc-n-history-details.html    #  this fixed one file, there's one left
   # find . -type f -name 'setup-diverse-systems-prototype.sh.html'  -exec perl -pi -e 's///g' {} \;

grep "a string" *     # two files
perl -pi -e 's/a string//g' bashrc-n-history-details.html 
grep "a string" *    # just one file
find . -type f -name 'setup-diverse-systems-prototype.sh.html'  -exec perl -pi -e 's/a string//g' {} \;
grep "a string" *    # none found

   At this point we change the string to hit all html files and go to the top of each directory where this might exist.

GOOD IDEA TO TEST FIRST:
find . -type f -name '*.html'  -exec grep "a string" {} \;
  <img src="http://johnmeister.com/pix/practical-suggestions-for-Microsoft-Windows-small.jpg" width="86%"><br></a>
THEN EXECUTE REPLACEMENT OF STRING:
find . -type f -name '*.html'  -exec perl -pi -e 's/a string//g' {} \;

--> time find . -type f -name '*.html' -exec grep "even free" {} \;
   real 4m17.062s
   user 0m57.338s
   sys 0m31.313s

SEE ALSO: http://johnmeister.com/linux/SysAdmin/How-to-update-18-thousand-pages.html
   for use of xargs or just find...

3) time permitting, review the work of a couple years' using wget, perl, sed, grep, awk, vi, html and find...
   to build a complete website that works with any browser or device and is portable and stands alone.


Session #132 - August 17, 2018 -

Session #132- August 17, FRIDAY: SIMPLE ALWAYS WORKS!   

1) dd of 8TB - took 16.36 days... not doing THAT again. :) 
http://johnmeister.com/linux/FileSystems/usingParted8TB.html 
http://johnmeister.com/linux/FileSystems/rescuing-failing-drive-w-dd.html 
------------------------------------------------ 
root@LOVING-LINUX -root- [/root]
------------------------------------------------
--> dd if=/dev/sdc of=/dev/sdd
------------------------------------------------
dd: reading `/dev/sdc': Input/output error
43184776+0 records in
43184776+0 records out
22110605312 bytes (22 GB) copied, 4404.01 s, 5.0 MB/s
------------------------------------------------

2)   rsync script:

3) sequentially number files - had to do a bit of manual processing... 
        http://johnmeister.com/linux/Scripts/number-file-names.sh.html
had to number files from 1 to 1,189 to get:  http://bibletech.net from http://bibletech.net/READ

4) used paste, head, cut, cat, and a few other commands to merge several separate files into 1,189 html files.  

5)  Reordered the daily reading to follow the Hebrew version's ordering of these texts... 
http://bibletech.net/Mikrah  - looking at it I need to create an index file for it... 
that will involve:   ls > index.html ; vi index.html  ;  :%s$.*$ & $g  ;
and then will need to put them into various sections, add headings and insert the table format associated with my base README.html template,
which is mostly laid out in this script:
http://johnmeister.com/linux/Scripts/mk-webpage-2015-05-08.html  (this creates an "ALL.html" page)
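A minimal sketch of the sequential-numbering step, zero-padded so the names sort; this is my assumption about the approach, not the number-file-names.sh script itself:

```shell
# hypothetical: prefix each html file with a zero-padded sequence number
n=0
for f in *.html; do
    n=$((n + 1))
    mv -- "$f" "$(printf '%04d' "$n")_$f"
done
```

The glob expands once, in sorted order, before the loop starts, so renamed files are not picked up a second time.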


Session #131 - July 27, 2018 -

Session #131- July 27, FRIDAY: SIMPLE ALWAYS WORKS!   

1) migrated from 4TB to 8TB -
    used parted, http://johnmeister.com/linux/FileSystems/usingParted8TB.html

2) then used rsync and dd - rsync from 4TB to 8TB took over a day... dd has been almost a week... not done... 

3) using script to rsync and update files between computers -
      strategy for use, including external drives kept in an airtight metal box.

4) modifying a script to sequentially number files: 
  http://johnmeister.com/linux/Scripts/number-file-names.sh.html number files from 1 to 1,189 
  http://bibletech.net/BY-CHAPTER/ and then create an index that matches what's already encoded! 
  Will be using awk, sed, and likely perl. 
 Some of the scripts will be on cygwin, others on SuSE. 

other scripts useful for archiving: 
http://johnmeister.com/linux/Scripts/wrapper-copy-w-date.sh.html 
http://johnmeister.com/linux/Scripts/wrapper-KSH-experiment.html 
http://johnmeister.com/linux/Scripts/wrapper-script.html 
after you get a wrapper script or PROCESS in place, you can expand it across systems 
     and do some things with ssh/scp and rsync: 
http://johnmeister.com/linux/Scripts/rsync-directories.sh.html 
http://johnmeister.com/linux/Scripts/run-now-on-all-hosts.sh.html 


Session #130 - June 15, 2018 - "home brew" revision control

Session #130- June 15, FRIDAY: SIMPLE ALWAYS WORKS! - "home brew" revision control and file management 

Talking with a few of the regular attendees, I feel inspired to discuss the creation of a "home brew" revision control system.

I've "cooked" up a few things over the years to manage system files and notes. 
Keeping it simple keeps it portable. 

Simple is what you can remember... YMMV. 


http://johnmeister.com/linux/Scripts/wrapper-copy-w-date.sh.html 

http://johnmeister.com/linux/Scripts/wrapper-KSH-experiment.html 

http://johnmeister.com/linux/Scripts/wrapper-script.html 

then, after you get a wrapper script or PROCESS in place, you can expand it across systems 
and do some things with ssh/scp and rsync: 


http://johnmeister.com/linux/Scripts/rsync-directories.sh.html 

http://johnmeister.com/linux/Scripts/run-now-on-all-hosts.sh.html 


#!/bin/bash
# vicp.sh - created 2015-09-01 jm - wrapper script
# updated 2018-06-12 to add a "BAK" directory in local path, be sure to chmod 755
##################################################################################
# if desired, add an archive Directory path to copied file name
# /usr/bin/cp $1 $1.before-`date +'%Y%m%d-%H%M'`      # BEFORE
# /usr/bin/cp $1 BAK/$1.after-`date +'%Y%m%d-%H%M'`   # AFTER
##################################################################################
if [ ! -d BAK ]   # tests for directory, if not, then mkdir...
                  # echo in-loop   # commented to test loop - had a problem with special characters
then
    mkdir BAK
    # echo dirbak   # comment to check process.. a space, a tick and an apostrophe...
    echo "history directory made"
fi
cp $1 BAK/$1.before-`date +'%Y%m%d-%H%M'`
# ls -al BAK   # added this to see that the loop worked (cutting and pasting has issues!)
vi $1
cp $1 BAK/$1.after-`date +'%Y%m%d-%H%M'`
##################################################################################
Working as a Systems Administrator you do NOT always have the luxury of advanced tools and applications... while there are good tools available in UNIX and Linux and likely Cygwin, they are NOT installed by default, and if they are they may not be configured. You can't trust them. So, yeah, I know there are tools, but they're not always in YOUR tool box when you're dealing with bare metal or on "other" systems.

My focus in training and sharing is on what I've used and can remember from the real world. If you're using a system all the time in a stable environment you may be able to add the "better" tools... or do what I did and "grow" my scripts to do more things. These scripts work because I've used them to manage enterprises. I make them work, then build a web page to share them... and... cut and paste into any environment I might find myself... because trying to remember some of these things is a challenge... syntax can be complex.

Simple Always Works. I'm pretty simple, and always seem to be working... ...

Session #129 - June 8, 2018 - GPARTED & SuSE 42.3 update; CygWin/BASH continued

Session #129 - June 8, FRIDAY: SIMPLE ALWAYS WORKS!

SITUATION:  Toshiba R600 Portege laptop, 500GB drive - SuSE 42.2 - EOL.  Needed to update, and low on space.
RESOURCES:  DVD with SuSE 42.3 Leap; 1TB SSD from recently deceased laptop; and Gparted on a USB stick.

Thought I would need to use EZGig or Clonezilla... EZGig was version 2... out of date, failed... downloaded v6, but
it required the hardware device... long ago deceased and disposed of... Tried Clonezilla... bad disk... dug out my USB stick with
Gparted that we've used in prior sessions to rescue systems, and dd for cloning:
http://johnmeister.com/linux/Notes/Gparted-for-Recovery/ALL.html
http://johnmeister.com/linux/FileSystems/rescuing-failing-drive-w-dd.html

http://johnmeister.com/linux/FileSystems/Gparted-clone-expand-partition-n-upgrade-SuSE.html

But I forgot that one can "copy" partitions with Gparted...  so I tested it, and it worked.
Getting the system to boot to the correct drive was a challenge... had to be careful of /dev/sda as it had the good SuSE 42.2 with MATE... attached to a USB SATA dual drive unit was the target 1TB SSD (which had a full-up MINT/MATE environment on it with a corrupted MATE Menu; the files had already been copied to another 1TB SATA drive). And then of course the USB stick with Gparted (Debian Linux), which was what I needed to boot to.

It took a few tries to get the correct boot device, but once Gparted started it was all good. What I ended up doing was removing the 1TB drive from the USB/SATA "toaster", and that allowed the system to find the USB stick; once the Gparted screen appeared and BEFORE it loaded its kernel, I reinserted the 1TB drive. If you boot into the Gparted kernel without both drives (source and target) in place it won't work; they need to be registered in the kernel when started.

The process steps were:

1) boot into Gparted (disk or thumbdrive) - make sure all disks are attached
2) identify drives, e.g.

    /dev/sda1  swap 10GB
    /dev/sda2  root 138GB  (SuSE 42.2)
    /dev/sda3  home 318GB

    /dev/sdb1  swap 8GB
    /dev/sdb2  root 989GB  (Mint/Mate)

    /dev/sdc1  root 512M  (Gparted/Debian)

3) select /dev/sdb - create partition (gpt) 
    - yes, delete all partitions ; apply

4) select /dev/sda1 - copy ; select /dev/sdb, paste 
    (leave the same size, 10GB for 8GB of memory is adequate for swap)

5) select /dev/sda2 - copy ; select /dev/sdb, paste 
    (leave the same size, 138GB for the OS is adequate)

6) select /dev/sda3 - copy ; select /dev/sdb, paste 
    - AND expand to fill the drive 998GB or whatever.

7) after being completely certain you did this right, click APPLY and wait... 
    for what may be a long time...

8) once complete reboot the system verify it's ok... 
    this is where the real work begins.

9) disassemble the laptop and swap the drives, making sure you keep them straight.
    remember, the original drive is still good and correct.

10) swap drives (YMMV), boot up to the cloned drive... good luck, you'll need it.
    - at this point things can get very complicated...
    - if your system used UUID instead of KERNEL devices (i.e. some cryptic string, vs. /dev/sda1)
       you will have a problem... the UUID of the device you cloned will be different...
       that means you'll have to edit /etc/fstab and likely go into /boot and grub.cfg and so on.
    - if you can't boot at this point, and you're smart enough to use OpenSuSE, you have options.

11) Assuming you were bright enough to use OpenSuSE, you have options.
    - you can fiddle with files, or boot up to the install media.
    - when booting to the SuSE Install disk, select upgrade (if you select install the disk 
        is smart enough to realize that it's already installed, and allow you to upgrade,
        which isn't something that often happens with other distros... of course, YMMV.)
    - when doing the upgrade I always select at least one or two additional packages,
        by doing this I make sure the system is changed so that the boot features are also changed.

12) the system will do its upgrade and usually fix the booting issue.
    - if not either dig into grub or put the old drive back in.
    - once the old drive is in, modify grub.cfg and /etc/fstab to show KERNEL drivers, not UUID,
      or you can get fancy and fake out the UUID on your system to make the new drive appear to be
      the old drive... confused?  yeah, me too... so I ignore the warning about kernel names and
      go that way... however, if you swap drives around on your bus by accident, that will BITE you! be careful. 
    once the system boots up you can restore your old repos (non-oss) and make sure everything is working.
    (I forgot to save my old repos... needed non-oss for virtualbox and VLC... not hard to add back in.).
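For reference, the two /etc/fstab styles that step 10 is talking about look roughly like this (the device names and UUID are illustrative values, not from the laptop above):

```shell
# two styles of /etc/fstab root entry (illustrative values):
#   /dev/sdb2                             /  ext4  defaults  0 1   # kernel name
#   UUID=0a1b2c3d-1111-2222-3333-abcdef01 /  ext4  defaults  0 1   # UUID style
# after a Gparted copy the new partition has a different UUID, so the UUID
# style breaks; `blkid` lists the current UUIDs if you'd rather fix fstab to match
```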

This was way more fun than CygWin... 


Session #128 - May 11, 2018 - BASH Commands in CYGWIN - continued

Session #128- May 11, FRIDAY: SIMPLE ALWAYS WORKS!    just not always as fast as we'd like...              
 
Plan to continue working with CygWin and using it for parsing information;
                while not as fast as native Linux, it's cost-effective (as in free and no additional hardware required) and convenient. 
I recognize that this tool is the best way of getting power users familiar with regular expressions and Linux!
 
I have been testing various configurations and working with others to learn more about this tool.  I was able to get ssh working in Cygwin and am
researching it further.  The plan is to see if we can ssh between Microsoft systems!   This will greatly help me manage my computer lab!!!!
Also considering making X-windows work in Cygwin...  lots to explore with this tool, one computer, one OS, but many tools! 
 
The focus on the sessions will be about the commands and scripts, 
    and I will comment on how these commands relate to the Linux environment where they differ.
 
Using CYGWIN for these sessions provides the LOWEST COMMON DENOMINATOR to help folks learn regular expressions and Linux commands.

Session #127 - April 27, 2018 - BASH Commands in CYGWIN - continued

remember:  SIMPLE ALWAYS WORKS!  Find the lowest common denominator in your setup and tools, master them.

April 27, FRIDAY - Linux at Lunch 
Linux at Lunch - Session #127 - Real World Linux Commands - in CYGWIN continued.
Working through the list of commands to see which are built-in, which are commands and which are not there.

THIS WEEK: parsing info from the Microsoft "systeminfo" command, and also some CSV data from Active Directory.
    Will identify details about a system and create a CSV file to input into Excel, or OpenOffice, or LibreOffice.

 http://johnmeister.com/linux/SysAdmin/system-info-to-CSV-sysadmin.sh.html 

 http://johnmeister.com/linux/Notes/Real-world-Linux-Commands.html
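The shape of that parsing can be sketched with awk. The sample lines below stand in for real `systeminfo` output, and the field spacing varies by system, so treat the separator as an assumption:

```shell
# hypothetical sketch: squeeze a few "Label:   value" lines into one CSV row
# (in Cygwin you would pipe `systeminfo` in instead of printf)
printf 'Host Name:                 LAB-PC01\nOS Name:                   Microsoft Windows 10 Pro\n' |
awk -F':[ ]+' '{ printf "%s%s", sep, $2; sep = "," } END { print "" }'
```

which prints one comma-separated row (LAB-PC01,Microsoft Windows 10 Pro) ready to paste into Excel, OpenOffice, or LibreOffice.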


Session #126 - April 13, 2018 - BASH Commands in CYGWIN - continued

remember:  SIMPLE ALWAYS WORKS!  Find the lowest common denominator in your setup and tools, master them.

April 13, FRIDAY - Linux at Lunch 
Linux at Lunch - Session #126 - Real World Linux Commands - in CYGWIN continued.
Working through the list of commands to see which are built-in, which are commands and which are not there.
 http://johnmeister.com/linux/Notes/Real-world-Linux-Commands.html


Session #125 - April 6, 2018 - BASH Commands in CYGWIN

remember:  SIMPLE ALWAYS WORKS!  Find the lowest common denominator in your setup and tools, master them.

April 6, FRIDAY - Linux at Lunch 
Linux at Lunch - Session #125 - Real World Linux Commands - in CYGWIN 


http://johnmeister.com/linux/Notes/Real-world-Linux-Commands.html


Session #124 - March 30, 2018 - building interoperability

remember:  SIMPLE ALWAYS WORKS!  Find the lowest common denominator in your setup and tools, master them.

March 30, FRIDAY - Linux at Lunch 
Linux at Lunch - Session #124 - building a base of interoperability with Linux

Two topics this week:
Discuss a variety of file management practices to ensure interoperability.
Using Cygwin on Windows and working with Linux or Mac with the same files.
Moving files between systems on a network, via email or via thumbdrive.
Setting up a thumbdrive to work on Mac, Linux and Microsoft.
Using rsync, winscp, putty, Exceed, VNC, VM's, ssh, scp, config mgt and backup tools.
Linux commands involving ssh setup, rsync, scp, and creating a function or alias to make backup copies of files.
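The function-or-alias idea can be as small as this; the name `bak` is my choice for illustration, not a function from the site:

```shell
# hypothetical .bashrc helper: copy a file with a sortable date-time suffix
bak() { cp -p "$1" "$1.$(date +'%Y%m%d-%H%M')"; }
```

For example, bak ~/.vimrc leaves something like ~/.vimrc.20180330-1215 beside the original; -p preserves the timestamps and permissions of the copy.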

Useful links for learning more about Linux:
http://johnmeister.com/linux/Overview/
http://johnmeister.com/linux/Commands/commands-bash.html
http://johnmeister.com/linux/vi/ - the only editor worth learning if you're going to work at the command line in Linux/UNIX
 http://johnmeister.com/linux/Intro-to-Linux/lab-exercise-1.html
http://johnmeister.com/linux/Intro-to-Linux/lab-exercise-2.html
http://johnmeister.com/linux/Commands/Man-Pages-Lab/create-man-txt.sh.html
http://johnmeister.com/linux/Commands/vi-examples-of-sed-strings.html
http://johnmeister.com/linux/Commands/MAN/LPI-commands.html
http://johnmeister.com/linux/Intro-to-Linux/Special-Characters.pdf
http://johnmeister.com/linux/Intro-to-Linux/Slides/ALL.html
http://johnmeister.com/linux/Intro-to-Linux/OLDER-NOTES/
http://johnmeister.com/linux/Scripts/Engineering/
for home, BASH Win10 - older info: http://johnmeister.com/linux/Microsoft/Win10-BASH/
https://www.smashwords.com/books/search?query=John+Meister  link to e-books
http://johnmeister.com/linux/The_Art_of_Linux_System_Administration.html
http://www.oreilly.com/pub/au/6963 videos:  http://shop.oreilly.com/product/0636920050209.do


Session #123 - March 23, 2018 - /bin vs. /usr/bin; systemd (systemctl and journalctl)

remember:  SIMPLE ALWAYS WORKS!  Find the lowest common denominator in your setup and tools, master them.

Two topics this week:
1) FHS (Filesystem Hierarchy Standard) info related to setting up .bashrc and related VIM issues:

http://johnmeister.com/linux/FileSystems/Linux-bashrc-vim-differences.jpg

2) "very" quick refresher on systemd, logs, and services:
http://johnmeister.com/linux/Notes/systemd-info.html

3) open talk on future sessions - thinking going through very basic regular expression use might be helpful... thoughts?



Session #122 - March 9, 2018 - wget, paste and reg ex tools

Linux at Lunch - Session #122 - March 9, 2018 - wget, paste, sed, and other useful tools in practice
          Linux at Lunch sessions EVERY OTHER WEEK during 2018:
            March 9 (#122), March 23 (#123); April 6 (#124), April 20 (#125);
            May 4 (#126), May 18 (#127); June 8 (#128); July 27 (#129);
            August 10 (#130), Aug 24 (#131); Sep 7 (#132), Sep 21 (#133);
            Oct 19 (#134); Nov 16 (#135) - END OF SERIES (? mmv...).

Updated a long-term project; you can use "wget" to copy this entire directory to a local drive, 
allowing you to view it without a network.  TO DOWNLOAD:

   wget -r -np -nd -P RKNG http://johnmeister.com/tech/RKNG

alternate method:
mkdir RKNG ; cd RKNG ; wget -r -np -nd  http://johnmeister.com/tech/RKNG

the -r is for recursive (goes down the path listed;
    can be limited with "-l #", e.g. "-l 2" for two levels down)

the -np is for "no parent" (doesn't go back up toward the site root - only gets the path specified)

the -nd is for "no directories" - prevents recreating the entire remote directory structure (FQDN and path) locally

the -P RKNG is the designated path (directory prefix) to copy to.
        use whatever name you prefer, or mkdir "name" ; cd "name"

That project made use of a lot of sed, grep, awk, perl, and vi edits.  
It is a self-contained directory with no external links, 
using only internal HTML references via name and href anchors.

Useful links for learning more about Linux:
 http://johnmeister.com/linux/Overview/
http://johnmeister.com/linux/Commands/commands-bash.txt
http://johnmeister.com/linux/vi/ - seriously vi is the only editor worth 
        really learning if you're going to work at the command line in Linux/UNIX
 http://johnmeister.com/linux/Intro-to-Linux/lab-exercise-1.html
http://johnmeister.com/linux/Intro-to-Linux/lab-exercise-2.html
http://johnmeister.com/linux/Commands/Man-Pages-Lab/create-man-txt.sh.txt
http://johnmeister.com/linux/Commands/vi-examples-of-sed-strings.html
http://johnmeister.com/linux/Commands/MAN/LPI-commands.html
http://johnmeister.com/linux/Intro-to-Linux/Special-Characters.pdf
http://johnmeister.com/linux/Intro-to-Linux/Slides/ALL.html
http://johnmeister.com/linux/Intro-to-Linux/OLDER-NOTES/
http://johnmeister.com/linux/Scripts/Engineering/
for home, BASH Win10 - older info: http://johnmeister.com/linux/Microsoft/Win10-BASH/
  
https://www.smashwords.com/books/search?query=John+Meister  link to e-books
http://johnmeister.com/linux/The_Art_of_Linux_System_Administration.html
 http://www.oreilly.com/pub/au/6963 

giving some thought to the following option, not completely sure yet:
 < esc >:wq!  

Session #121 - February 23, 2018 - FRIDAY - cygwin scripting environments


Linux at Lunch - Session #121 - Feb 23, 2018 - cygwin, Linux 101, CSV, exiftool
Session #121 February 23, 2018 - FRIDAY - Linux at Lunch -  session opens 15 minutes before...
1) cygwin  - basic steps to setup
            (for home, BASH Win10 - older info: 
            http://johnmeister.com/linux/Microsoft/Win10-BASH/Win10-BASH-install.html)
2) Linux 101 - overview of commands
3) CSV - script showing conversion of system info to CSV to Excel
    part of the Video series:  http://johnmeister.com/linux/The_Art_of_Linux_System_Administration.html
    Script To Create System Info Spreadsheet 06m23s 0808-SA-tools managing the system resources 
    using a spreadsheet
4) exiftool and CSV
-------------------------------------------
This week we'll talk about installing Cygwin, show the basic setup with .bashrc and .vimrc,
 http://johnmeister.com/linux/basic-setup-for-BASH.html
 and then discuss commands briefly from the Overview page linked below - kind of a Linux 101, 
walking through the various groupings of commands and then moving into some scripts 
to show the commands in action.  http://johnmeister.com/linux/Overview/
http://johnmeister.com/linux/Overview/LinuxOverview.pdf

ADDITIONAL INFO:
http://johnmeister.com/linux/Intro-to-Linux/lab-exercise-1.html
http://johnmeister.com/linux/Intro-to-Linux/lab-exercise-2.html
http://johnmeister.com/linux/Commands/vi-examples-of-sed-strings.html
http://johnmeister.com/linux/Commands/MAN/LPI-commands.html
http://johnmeister.com/linux/Intro-to-Linux/Special-Characters.pdf
http://johnmeister.com/linux/Intro-to-Linux/Slides/ALL.html
http://johnmeister.com/linux/Intro-to-Linux/OLDER-NOTES/

Then we'll walk through the CSV script and show how to read data into and out of Excel 
    with CSV (comma-separated values).
http://johnmeister.com/linux/SysAdmin/system-info-to-CSV-sysadmin.sh.html
from :  http://johnmeister.com/linux/The_Art_of_Linux_System_Administration.html

Script To Create System Info Spreadsheet (06m23s, 0808-SA-tools) - managing system resources 
using a spreadsheet.  I have an idea for a useful script for my photography: run exiftool, 
filter out specific details, place them in CSV format with the server link, and add them to Excel.

http://johnmeister.com/linux/Scripts/extract-base-EXIF-info.sh.html

  I can then open that file in OpenOffice and create a PDF or an HTML page for navigation.
Always interested in which lens I used, exposures and so on.
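The system-info-to-CSV idea boils down to a one-row sketch like this (the fields here are a minimal, made-up subset of what the linked script gathers):

```shell
# sketch: emit a CSV header and one row of system info -
# the same pattern the system-info-to-CSV script follows
header="hostname,kernel,cpus"
row="$(uname -n),$(uname -r),$(nproc)"
echo "$header"
echo "$row"
```

Redirect the two echo lines into a .csv file and Excel (or OpenOffice) will open it directly.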



Session #120 - February 16, 2018 - FRIDAY - scripting environments

discussion of scripts

a couple of scripts and PERFORMANCE



Session #119 - February 9, 2018 - FRIDAY - updated Engineering Week presentation and adventures with AWK
UPDATED the material - will go through the new powerpoint
During the Engineering Week presentation I will provide an overview of Linux with links to more information, and then describe a time savings of over 182 man-hours using Regular Expressions in an engineering analysis. The presentation is scheduled for Tuesday, 20 February 2018, from 1300-1400 hrs at the Bomarc Complex.

How to use Regular Expressions to save 13 hours of Microsoft Excel sorting time with 5 SECONDS in a Linux Shell. Seriously... there are pictures... kind of... (btw, I love spreadsheets... but... less than FIVE SECONDS!!!)
(using cygwin on Win7, pure Linux about 0.2 seconds)
Then we will discuss some simple features of AWK; the ones we can remember.
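The flavor of that 5-second win, sketched on made-up data (the real engineering files aren't shown in these notes):

```shell
# a regex/field filter replaces hours of manual spreadsheet sorting:
# pull the failing rows out of a CSV in one pass
printf 'PART-A,ok\nPART-B,fail\nPART-C,ok\n' > results.csv
time grep ',fail$' results.csv                   # the instant version of the sort
awk -F, '$2 == "fail" {print $1}' results.csv    # same idea using awk fields
```

Scale the input to tens of thousands of rows and the grep still finishes in well under a second.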


Session #118 - January 26, 2018 - FRIDAY - dry run of Engineering Week presentation
During the Engineering Week presentation I will provide an overview of Linux with links to more information,
and then describe a time savings of over 182 man-hours using Regular Expressions in an engineering analysis.

Presentation is scheduled for Tuesday, 20 February 2018 from 1300-1400 hrs at the Bomarc Complex. 

How to use Regular Expressions to save 13 hours of Microsoft Excel sorting time with 5 SECONDS in a Linux Shell. Seriously... there are pictures... kind of... (btw, I love spreadsheets... but... FIVE SECONDS!!!)


Session #117 - January 19, 2018 - FRIDAY - BASH command precedence
Discussion on command precedence and useful HTML features as demonstrated in this page:
There are two major lessons:
1) path considerations, special characters, built-ins, commands and aliases.
   see prior material at: bashrc-n-history-details.html
   see prior material at: basic-setup-for-BASH.html
   see example .bashrc file: bashrc-basic.html
2) practical use of HTML features, e.g. code for buttons used to hide material until expanded.
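A quick way to watch that precedence at work (a sketch; the `ll` alias is a throwaway for the demo):

```shell
# BASH resolves a name in order: alias -> function -> builtin -> executable in PATH
shopt -s expand_aliases        # alias expansion is off by default in scripts
alias ll='ls -l'               # throwaway alias for the demo
type -t ll                     # -> alias
type -t cd                     # -> builtin
type -t ls                     # -> file (found via PATH)
command ls / >/dev/null        # "command" bypasses aliases and functions
builtin echo hi >/dev/null     # "builtin" forces the shell builtin
```

Running `type name` on anything you're unsure of tells you exactly which of these will win.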




Session #116 - January 12, 2018 - FRIDAY - using command line tools to build a web page
############################################### 
1)  FILES TO MERGE INTO PARALLEL COLUMNS: used Excel to create the list  
      (moved columns in Excel), copy/paste
 
    http://johnmeister.com/bible/ReadInOneYear/one-year-plan.txt
  - Added leading zeros so all numbers were 3 digits for sorting, underscores for spaces.
  - Global replace in vi to put "touch" at the front of each line, save, and: sh ./listofnames
    that created the empty files; then ls > Daily-Passages.html, edited, and:
 
    http://johnmeister.com/bible/ReadInOneYear/Daily-Passages.html

Took the list and extracted only the chapters using cut to get the ranges needed.
   ls Day_212_Jul_07_31_Isaiah_57-59.html | cut -c 25-29  
        # tested placement of verses... off by 1
   ls Day_212_Jul_07_31_Isaiah_57-59.html | cut -c 26-30
   ls | cut -c 26-30 > create-files.sh 
        #  then I took the list of 365 days and created the list of chapters
--> echo Day_212_Jul_07_31_Isaiah_57-59.html | cut -c 26-30
57-59       #  so, VS="57-59" and would be created for each section. 
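The fixed-width trick above can be checked quickly (same file name as in the notes):

```shell
# cut -c extracts a fixed character range from each input line;
# characters 26-30 of this name are the chapter range
name="Day_212_Jul_07_31_Isaiah_57-59.html"
echo "$name" | cut -c 26-30    # -> 57-59
```

This only works because the leading zeros made every file name the same width up to the chapter field, which is why step 1 padded the numbers.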
############################################### 
2)  created copies: kjv, nas, greek 
   cp gnt.txt g$VS.txt ; cp KJVb.txt k$VS.txt; cp NAS-NT.txt n$VS.txt 
############################################### 
3) vi - edit out all but the chapters for reading: 
    vi file, /Mark 15, k, 1G, /Mark 17, dG, :wn ... last, :wq
############################################### 
4) check lines on all three files (using cat and wc -l)
cat g$VS.txt | wc -l ; cat k$VS.txt | wc -l ; cat n$VS.txt | wc -l
####  time saver:  cat g*.txt | wc -l ; cat k*.txt | wc -l ; cat n*.txt | wc -l
############################################### 
5) copy empty file from web directory  (would do this as the last step in the script)
   DAY1 would be the file in edit, DAYN is the next day.
############################################### 
6) paste tags and text together into one file
  paste -d '\n' tags1.txt k$VS.txt tags2.txt n$VS.txt tags3.txt g$VS.txt tags4.txt > $DAY1
#### manual / time saver mode (grabbed from history, command line completion of Day...): 
#  paste -d '\n' tags1.txt k*.txt tags2.txt n*.txt tags3.txt g*.txt tags4.txt \
#                    > Day_294_Oct_10_21_Luke_1-2.html
############################################### 
7) vi $DAY1 - delete empty rows at end;  :r FILE-BASE.txt ; type day and verses; save
############################################### 
8) scp file to server and mv to web directory
############################################### 
9) repeat for all 365 days: 1,189 chapters; 31,102 verses and 781,621 words (YMMV)

##########################################
#  script for part of the process:
##########################################
#!/bin/bash
#############
DAY1="Day_293_Oct_10_20_Mark_15-16.html" ; VS="15-16"
DAYN="Day_294_Oct_10_21_Luke_1-2.html"
#############
cp gnt.txt g$VS.txt ; cp KJVb.txt k$VS.txt; cp NAS-NT.txt n$VS.txt
vi ?$VS.txt
cat g$VS.txt | wc -l ; cat k$VS.txt | wc -l ; cat n$VS.txt | wc -l
read   # safety net - edit files in a different shell - hit enter when ready
#### THE MAIN COURSE: tags have html formatting including colors
paste -d '\n' tags1.txt k$VS.txt tags2.txt n$VS.txt tags3.txt g$VS.txt tags4.txt > $DAY1
###########
cat FILE-BASE.txt >> $DAY1   # I found that trimming the file of empty lines first was better
vi $DAY1 ; ls -al ; echo $DAY1
scp $DAY1 server:/home/luser/website/ReadInOneYear/
mv $DAY1 ../ReadInOneYear/
mv ?$VS.txt hold-tmp   # a safety net - just in case... (used it once so far)
#############
echo "next day"
cp ../10_Oct/$DAYN .
# cp ../07_Jul/Day_191_Jul_07_10_Ecclesiastes_1-4.html .   # manual steps for smaller sections
echo "next $DAYN" ; echo "===================="
###################################################### could queue up more days
# DAY1="Day_286_Oct_10_13_Mark_1-3.html" ; VS="1-3"
# DAYN="Day_287_Oct_10_14_Mark_4-5.html"
######################################################
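How the paste -d '\n' step interleaves whole files, shown on throwaway data (file names here are made up for the demo):

```shell
# paste -d '\n' joins line N of each input file separated by newlines,
# which interleaves the files line-by-line
printf 'A1\nA2\n' > left.txt
printf 'B1\nB2\n' > right.txt
paste -d '\n' left.txt right.txt    # -> A1 B1 A2 B2, one per line
```

This is why the line counts in step 4 had to match: paste pairs line N of every file, so a missing line in one file shifts every verse after it.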



2017


Session #115 - December 15, 2017 - FRIDAY - using find and perl
note on html: the color for the cell above is "ivory"; the font color is "black".
RGB (red, green, blue) values range from 0 to 255 (hex ff): 255,0,0 = red ; 0,255,0 = green ; 0,0,255 = blue
(names can be used too...)

NOTE: LAST SESSION OF 2017
http://johnmeister.com/linux/Intro-to-Linux/OLDER-NOTES/One-Hour-Linux-Sessions-2014-2017.html

using find and perl to update HTML pages for URL correction and consolidation of domains

http://johnmeister.com/linux/SysAdmin/Using-find-n-perl-to-manage-URLs.html
also: time, grep, col, and Apache: /etc/apache2/errors.conf and creating a missing.html page

Session #114 - December 8, 2017 - FRIDAY - Oracle VirtualBox and LVM


Expanding the capacity of a dynamic vdi disk in Oracle's VirtualBox

Built a default 64-bit Linux VM with CentOS; however, it needed more space, 
   so the capacity had to be increased from the default of 8GB to about 14GB. 
https://www.virtualbox.org/

http://johnmeister.com/linux/SysAdmin/Virtualization/VM_setup_SuSE_13.2/ALL.html
  
Steps to increase capacity of a vdi drive in VirtualBox 
        (Linux or MacOSx host) at the COMMAND LINE:

-->sudo VBoxManage modifyhd /home/luser/VirtualBox_VMs/Centos-Lab/Centos_Lab.vdi --resize 14480
 0%...10%...20%...30%...40%...50%...60%...70%...80%...90%...100%

Once that was done, checked settings in VirtualBox;  
        was good... but local drive in VM still not expanded.

FROM INSIDE THE VM, at the command line:  
    pvs ; pvdisplay ; vgs ; vgdisplay; lvs ; lvdisplay  
        # displays system values
    fdisk /dev/sda      
        #  create a partition in the "expanded" virtual drive
        p, n, p, 3, t, 8e, w    (8e = Linux LVM partition type)
    pvcreate /dev/sda3    ; pvs; vgs    
        # create the physical volume, check it
   vgextend centos_centos7-vm /dev/sda3  
        # extend the existing volume group into the expanded physical drive
   vgs ; pvscan ; lvdisplay ; df -h; vgdisplay
   lvextend /dev/mapper/centos_centos7--vm-root /dev/sda3 
        # extend the logical volume into the expanded volume group
   df -h . ; vgs ; lvs 
   resize2fs /dev/mapper/centos_centos7--vm-root  
       ## FAILURE - not actually a bug: resize2fs only handles ext2/3/4,
       ## and the default CentOS 7 root file system is XFS
--> resize2fs /dev/centos_centos7-vm/root   (fails on an XFS file system)
resize2fs 1.42.9 (28-Dec-2013)
resize2fs: Bad magic number in super-block while trying to open /dev/centos_centos7-vm/root
Couldn't find valid filesystem superblock.

FIX:
--> xfs_growfs  /dev/centos_centos7-vm/root

http://johnmeister.com/linux/FileSystems/ADD-6GB-to-vdi-CentOS-VM.html

process:  Order of precedence and logic:  PHYSICAL VOLUME -> VOLUME GROUP -> LOGICAL VOLUME
PHYSICAL VOLUME(s): pvs; pvdisplay; pvresize
VOLUME GROUP(s): vgs; vgdisplay; vgextend        Add physical volumes to a volume group
LOGICAL VOLUME(s): lvs ; lvdisplay; lvextend        Add space to a logical volume

        http://johnmeister.com/linux/FileSystems/setup-LVM.html

        http://johnmeister.com/linux/FileSystems/lvm-commands.html


Session #113 - December 1, 2017 - FRIDAY - RECAP of 3 years of Linux at lunch!


recap of 3 years worth of sessions

http://johnmeister.com/linux/Intro-to-Linux/session-list-2017.html

DETAILS 2014-2017 Linux sessions

SESSION #2 - December 3, 2014 - SAW:"simple always works"

provided link to exercise #1,
discussed environment (.bashrc), path, chmod, discussed "SAW" -
SAW:"simple always works",
History files viewed with sort, uniq, and how to create notes for reference:
ls, sort, grep, uniq, wc -l
(after using a command, use history recall, add echo and quotes around it,
and then append to ~/bin/cool-commands.txt e.g.

echo "ls *.jpg > ALL.html ; \
perl -pi -e 's/(.*)/<img src=\"\$1\"><BR>\$1<HR>/g' ALL.html ; \
perl -pi -e 's/<img src=\"\"><BR><HR>//g' ALL.html ; \
cat ALL.html" >> ~/bin/cool-commands.txt

----------------------------------------------------------------------------
Links: http://johnmeister.com/linux/Intro-to-Linux/lab-exercise-1.html
(do this on your system, use "script exercise-1.raw",
when finished type exit, then cat exercise-1.raw | col -b > exercise-1.txt
# then edit in vi and save)
http://johnmeister.com/linux/Scripts/chksys.sh.html
(use the vi editor to create and run on your system)
http://johnmeister.com/linux/Scripts/man-page-create-textfiles.sh.txt
(mkdir and cd LAB, then create this script in vi and run)
http://johnmeister.com/linux/Commands/grep-awk-uniq.html
(the output of the script above is needed to use the commands,
YMMV, counts may be different)


SESSION #1 - November 26, 2014 - OVERVIEW

	 provided overview of Linux using the pdf: 
        http://johnmeister.com/linux/Overview/LinuxOverview.pdf  
	5 basic commands: man, ls (ls -al, ls -Al), cd, pwd, more 
	discussed .bashrc and showed a few script examples, talked about "script"
----------------------------------------------------------------------------
Links:	http://johnmeister.com/linux/Overview/LinuxOverview.pdf    
        (print out and use as a guide)
	http://johnmeister.com/linux/Intro-to-Linux/Special-Characters.pdf 
        (print out and keep as a guide)
	http://johnmeister.com/linux/Notes/bashrc-the-dotfile.html  
        (copy and build your own .bashrc for Cygwin or Linux)
	http://johnmeister.com/linux/Scripts/chksys.sh.html  
	http://johnmeister.com/linux/Intro-to-Linux/lab-exercise-1.html  
        (use the commands and the vi editor)


THE CORE MATERIAL: The power of the command line - Simply Linux: Basics
will work our way through the Linux book (Simply Linux: Basics) under construction, based on:
http://johnmeister.com/linux/Overview/
http://johnmeister.com/linux/Overview/LinuxOverview.pdf (print out and use as a guide)
http://johnmeister.com/linux/Overview/Linux-PowerPoint-2004-overview.pdf
http://johnmeister.com/linux/Notes/Real-world-Linux-Commands.
http://johnmeister.com/linux/Intro-to-Linux/Special-Characters.pdf (print out and use as a guide)
http://johnmeister.com/linux/Notes/bashrc-the-dotfile. (copy and build your own .bashrc for Cygwin or Linux)
