| ==Phrack Magazine== |
|
|
| Volume Four, Issue Forty-Two, File 11 of 14 |
|
|
|
|
| ################################################### |
| # The Paranoid Schizophrenics Guide to Encryption # |
| # (or How to Avoid Getting Tapped and Raided) # |
| ################################################### |
|
|
| Written by The Racketeer of |
| The /-/ellfire Club |
|
|
|
|
| The purpose of this file is to explain the why and the how of Data |
| Encryption, with a brief description of the future of computer security, |
| TEMPEST. |
|
|
     At the time of this issue's release, two of the more modern software
packages use the encryption methods covered in this article, so exercise some
of your neurons and check for newer releases if they are available.  This
file covers PGP's implementation of Phil Zimmermann's RSA variant, as well as
the MDC and IDEA conventional encryption techniques as used by HPACK and
PGP.
|
|
| -------------------- |
| WHY DATA ENCRYPTION? |
| -------------------- |
|
|
     This isn't exactly the typical topic discussed by me in Phrack.
However, knowing encryption is necessary when dealing with any quasi-legal
computer activity.  I was planning on starting my series on
| hacking Novell Networks (so non-Internet users can have something to do), but |
| recent events have caused me to change my mind and, instead of showing people |
| how to get into more trouble (well, okay, there is plenty of that in this file |
| too, since you're going to be working with contraband software), I've opted |
| instead to show people how to protect themselves from the long arm of the Law. |
|
|
| Why all this concern? |
|
|
| Relatively recently, The Masters of Deception (MoD) were raided by |
| various federal agencies and were accused of several crimes. The crimes they |
| did commit will doubtlessly cause more mandates, making the already |
| too-outrageous penalties even worse. |
|
|
| "So?" you might ask. The MoD weren't exactly friends of mine. In fact, |
| quite the contrary. But unlike many of the hackers whom I dealt with in the |
| "final days" prior to their arrest, I bitterly protested any action against the |
| MoD. Admittedly, I followed the episode from the beginning to the end, and the |
| moral arguments were enough to rip the "Hacker World" to pieces. But these |
| moral issues are done, the past behind most of us. It is now time to examine |
| the aftermath of the bust. |
|
|
     According to the officials in charge of the investigation against the
MoD, telephone taps were successfully used to gain evidence against its
members.  All data going in and out of their houses was monitored, as were
all voice communications, especially those between members.
|
|
     So, how do you make a line secure?  The party line answer is the use
of effective encryption methods.
|
|
| Federal investigative agencies are currently pushing for more |
| technological research into the issue of computer security. All of the popular |
| techniques which are being used by hackers today are being used by the |
| government's R&D departments. |
|
|
     Over the course of the last 5 years, I've watched as the U.S.
Government went from a task force of nearly nil all the way to a powerful
marauder.  Their mission?  Unclear.  Regardless, the research being
accomplished by federally-funded projects dealing with the issues of computer
security is escalating.  I've personally attended many such conferences and
have carefully examined the issues.  Many of these issues will become future
Phrack articles which I'll write.  Others, such as limited-life
semiconductors and deliberate telephone line noise sabotage triggered by ACK
packet detection in order to drive telecommunication costs higher, are sadly
unpreventable problems of the future which won't be cured by simple awareness
of the problem.
|
|
| They have different names -- Computer Emergency Response Team (CERT), |
| Computer Assisted Security Investigative Analysis Tool (FBI's CASIAT), the |
| Secret Service's Computer Fraud Division, or the National Computer Security |
| Center (NSA's NCSC). Scores of other groups exist for every network, even |
| every operating system. Their goal isn't necessarily to catch hackers; their |
goal is to acquire information about the act of hacking itself until it is no
longer a problem.  Encryption stands in the way.
|
|
| Computer Security is literally so VAST a concept that, once a person |
| awakens to low-level computer mechanics, it becomes nearly impossible to |
| prevent that person from gaining unauthorized access to machines. This is |
| somewhat contradictory to the "it's all social engineering" concept which we |
| have been hearing about on Nightline and in the papers. If you can't snag them |
| one way though, you can get them another -- the fact is that computers are |
| still too damn vulnerable these days to traditional hacking techniques. |
|
|
| Because of the ease of breaking through security, it becomes very |
| difficult to actually create an effective way to protect yourself from any form |
| of computer hacking. Look at piracy: they've tried every trick in the book to |
| protect software and, so far, the only success they have had was writing |
| software that sucked so much nobody wanted a copy. |
|
|
     Furthermore, totally non-CPU related attacks are taking place.  The
passing of Anti-TEMPEST Protection Laws, which prevent home users from owning
computers that don't give off RF emissions, has made it possible for any Joe
with a few semesters of electrical engineering knowledge to rig together a
device that can read what's on your computer monitor.
|
|
| Therefore: |
|
|
| Q: How does a person protect their own computer from getting hacked? |
|
|
| A: You pretty much can't. |
|
|
     I've memorized so many ways to bypass computer security that I can
rattle them off in pyramid levels.  Even if a computer is not connected to a
network or phone line, people can watch every keystroke typed and everything
displayed on the screen.
|
|
| Why aren't the Fedz using these techniques RIGHT NOW? |
|
|
| I can't say they are not. However, a little research into TEMPEST |
| technology resulted in a pretty blunt fact: |
|
|
| There are too many computer components to scan accurately. Not the |
| monitor, oh no! You're pretty much fucked there. But accessories for input |
| and output, such as printers, sound cards, scanners, disk drives, and so |
| forth...the possibility of parallel CPU TEMPEST technology exists, but there are |
| more CPU types than any mobile unit could possibly use accurately. |
|
|
| Keyboards are currently manufactured by IBM, Compaq, Dell, Northgate, |
Mitsumi (bleah), Fujitsu, Gateway, Focus, Chicony, Omni, Tandy, Apple, Sun,
| Packard-Bell (may they rot in hell), Next, Prime, Digital, Unisys, Sony, |
| Hewlett-Packard, AT&T, and a scattering of hundreds of lesser companies. Each |
of these manufacturers has custom models, programmable models, 100+ key and < 100
| key models, different connectors, different interpreters, and different levels |
| of cable shielding. |
|
|
     For the IBM compatible alone, patents are owned on multiple keyboard
pin connectors, such as those for OS/2 and Tandy, and the ISA chipsets are
nearly as diverse as the hundreds of manufacturers of motherboards.  Because
of lowest-bid practices, there can be no certainty of any particular
connection -- especially when you are trying to monitor a computer you've
never actually seen!
|
|
| In short -- it costs too much for the TEMPEST device to be mobile and |
| to be able to detect keystrokes from a "standard" keyboard, mostly because |
| keyboards aren't "standard" enough! In fact, the only real standard which I |
| can tell exists on regular computers is the fact that monitors still use good |
| old CRT technology. |
|
|
| Arguments against this include the fact that most of the available PC |
| computers use standard DIN connectors which means that MOST of the keyboards |
| could be examined. Furthermore, these keyboards are traditionally serial |
| connections using highly vulnerable wire (see Appendix B). |
|
|
     Once again, I raise the defense that keyboard cables are traditionally
the most heavily shielded (mine is nearly 1/4 inch thick), which brings us
back to the question of how accurate a portable TEMPEST device can be, and
whether it is cost effective enough to use against hackers.  Further
viewpoints and a TEMPEST overview can be found in Appendix B.
|
|
| As a result, we have opened up the possibility for protection from |
| outside interference for our computer systems. Because any DECENT encryption |
| program doesn't echo the password to your screen, a typical encryption program |
| could provide reasonable security to your machine. How reasonable? |
|
|
     If you have 9 pirated programs installed on your computer at a given
time and you were raided by some law enforcement holes, you would not be
labeled a felon.  In fact, it wouldn't even be worth their time to raid you.
If you have 9 pirated programs installed on your computer, had 200 pirated
programs encrypted in a disk box, and you were raided, you could only be
charged with possession of 9 pirated programs (unless you did something
stupid, like write "Pirated Ultima" or something on the label).
|
|
| We all suspected encryption was the right thing to do, but what about |
| encryption itself? How secure IS encryption? |
|
|
| If you think that the world of the Hackers is deeply shrouded with |
| extreme prejudice, I bet you can't wait to talk with crypto-analysts. These |
| people are traditionally the biggest bunch of holes I've ever laid eyes on. In |
| their mind, people have been debating the concepts of encryption since the dawn |
| of time, and if you come up with a totally new method of data encryption, -YOU |
| ARE INSULTING EVERYONE WHO HAS EVER DONE ENCRYPTION-, mostly by saying "Oh, I |
| just came up with this idea for an encryption which might be the best one yet" |
| when people have dedicated all their lives to designing and breaking encryption |
| techniques -- so what makes you think you're so fucking bright? |
|
|
| Anyway, crypto-(anal)ysts tend to take most comments as veiled insults, |
| and are easily terribly offended. Well, make no mistake, if I wanted to insult |
| these people, I'd do it. I've already done it. I'll continue to do it. And I |
| won't thinly veil it with good manners, either. |
|
|
     The field of Crypto-analysis has traditionally had a mathematical
emphasis.  The Beale Cipher and the German Enigma Cipher are among the more
famous examples in the field.  Ever since World War 2, people have spent time
researching how technology was going to affect the future of data encryption.
|
|
| If the United States went to war with some other country, they'd have a |
| strong advantage if they knew the orders of the opposing side before they were |
| carried out. Using spies and wire taps, they can gain encrypted data referred |
| to as Ciphertext. They hand the information over to groups that deal with |
| encryption such as the NSA and the CIA, and they attempt to decode the |
| information before the encrypted information is too old to be of any use. |
|
|
     The future of Computer Criminology rests on the same race.  The
deadline for prosecuting most white collar crimes defaults to about 3-4
years, a period called the Statute of Limitations.  Once an encrypted file is
obtained, the task becomes decrypting it within the statute's time.
|
|
     As most crypto-analysts would agree, the cost in man-hours as well as
supercomputer time makes it unfeasible to mount brute force attacks against
arbitrary encryption methods.  As a result of this, government regulation
stepped in.
|
|
| The National Security Agency (referred to as "Spooks" by the relatively |
| famous tormenter of KGB-paid-off hackers, Cliff Stoll, which is probably the |
| only thing he's ever said which makes me think he could be a real human being) |
| released the DES -- Data Encryption Standard. This encryption method was |
| basically solid and took a long time to crack, which was also the Catch-22. |
|
|
| DES wasn't uncrackable, it was just that it took "an unreasonable |
| length of time to crack." The attack against the word "unreasonable" keeps |
| getting stronger and stronger. While DES originated on Honeywell and DEC PDPs, |
| it was rumored that they'd networked enough computers together to break a |
| typical DES encrypted file. Now that we have better computers and the cost |
| requirements for high-speed workstations are even less, I believe that even if |
| they overestimated "unreasonable" a hundredfold, they'd be in the "reasonable" |
| levels now. |
|
|
| To explain how fast DES runs these days... |
|
|
| I personally wrote a password cracker for DES which was arguably the |
| very first true high-speed cracker. It used the German "Ultra-Fast Crypt" |
| version of the DES algorithm, which happened to contain a static variable used |
| to hold part of the previous attempt at encrypting the password, called the |
| salt. By making sure the system wouldn't resalt on every password attempt, I |
| was able to guess passwords out of a dictionary at the rate of 400+ words per |
| second on a 386-25 (other methods at that time were going at about 30 per |
| second). As I understand it now, levels at 500+ for the same CPU have been |
| achieved. |
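     The speed-up described above -- setting up the salt once instead of
resalting on every guess -- is the heart of any dictionary cracker.  Since
the Ultra-Fast Crypt source isn't reproduced here, this is only an
illustrative sketch: `crypt_like` is a hypothetical stand-in using a plain
hash, NOT the real 25-round DES crypt(3), and all names are mine.

```python
import hashlib

def crypt_like(salt: str, password: str) -> str:
    """Toy stand-in for crypt(3): hash of salt + password.
    (Illustrative only -- the real thing is 25 rounds of modified DES.)"""
    return hashlib.sha256((salt + password).encode()).hexdigest()

def dictionary_attack(target_hash: str, salt: str, wordlist):
    """Try each candidate word against the stolen hash.  The salt is fixed
    before the loop, mirroring the "don't resalt on every attempt"
    optimization described above."""
    for word in wordlist:
        if crypt_like(salt, word) == target_hash:
            return word
    return None

# A "stolen" password entry and a small dictionary:
stolen = crypt_like("ab", "wizard")
print(dictionary_attack(stolen, "ab", ["password", "secret", "wizard"]))  # -> wizard
```

The only per-guess work left in the loop is one encryption and one compare,
which is why stripping the salt setup multiplied the guess rate.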
|
|
| Now this means I can go through an entire dictionary in about five |
| minutes on a DES-encrypted segment. The NSA has REAL cash and some of the |
| finest mathematicians in the world, so if they wanted to gain some really |
| decent speed on encryption, DES fits the ideal for parallel programming. |
| Splitting a DES segment across a hundred CPUs, each relatively modern, they |
could crank out teraflops of speed.  They'd probably be able to crack the code
| within a few days if they wanted to. |
|
|
| Ten years from now, they could do it in a few seconds. |
|
|
     Of course, the proper way to circumvent DES encryption is to locate
and adopt a more reliable, less popular method.  Just because the U.S.
Government regulates it doesn't mean it's the best.  In fact, it means it's
the fucking lamest thing they could sweeten up and hope the public swallows!
The last time the NSA tried regulating a standard dealing with encryption,
they got roasted.
|
|
| I'm somewhat convinced that the NSA is against personal security, and |
| from all the press they give, they don't WANT anyone to have personal security. |
| Neither does the Media for that matter. |
|
|
| Because of lamers in the "Biblical Injustice Grievance Group of |
| Opposing Terrible Sacrilege" (or BIGGOTS) who think that if you violate a LAW |
| you're going to Hell (see APPENDIX C for my viewpoint of these people) and who |
| will have convinced Congress to pass ease-of-use wire taps on telephone lines |
| and networks so that they can monitor casual connections without search |
| warrants, encryption will be mandatory if you want any privacy at all. |
|
|
| And to quote Phil Zimmermann, "If privacy is outlawed, only the |
| outlaws will have privacy." |
|
|
     Therefore, the encryption methods we use should fall into very solid
categories: those which do NOT have the endorsement of the NSA and which are
also technically sound.
|
|
| HOW TO USE DECENT ENCRYPTION: |
|
|
| (First, go to APPENDIX D, and get yourself a copy of PGP, latest version.) |
|
|
| First of all, PGP is contraband software, presumably illegal to use in |
| the United States because of a patent infringement it allegedly carries. The |
| patent infringement is the usage of a variant of the RSA encryption algorithm. |
Can you patent an algorithm?  By definition, you cannot patent an idea, just a
product -- like source code.  Yet the patent stands until proven invalid.
More examples of how people in the crypto-analyst field can be assholes.
|
|
     Anyway, Phil's Pretty Good Software, creators of PGP, were sued and all
rights to PGP were forfeited in the United States of America.  Then comes the
violation of the SECOND law: illegal exportation of data encryption software
outside of the United States of America.  Phil distributed his encryption
techniques outside the USA, which is against the law as well.  Even though
Mr. Zimmermann no longer works on PGP himself, because he freely gave his
source code to others, people in countries besides the United States are
constantly updating and improving the PGP package.
|
|
| PGP handles two very important methods of encryption -- conventional |
| and public key. These are both very important to understand because they |
| protect against completely different things. |
|
|
| ----------------------- |
| CONVENTIONAL ENCRYPTION |
| ----------------------- |
|
|
     Conventional encryption techniques are easiest to understand.  You
supply a password, and that password encrypts a file or some other sort of
data.  Re-entering the same password recreates the original data.
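     In miniature, the idea looks like this.  The sketch below is a toy
hash-based stream cipher of my own invention, NOT PGP's IDEA or HPACK's MDC;
it only illustrates the defining property that one password both encrypts
and decrypts.

```python
import hashlib

def keystream(password: str, length: int) -> bytes:
    """Stretch a pass phrase into a keystream by hashing password:counter
    blocks.  (Illustrative only -- IDEA and MDC work very differently.)"""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(f"{password}:{counter}".encode()).digest()
        counter += 1
    return out[:length]

def crypt(password: str, data: bytes) -> bytes:
    """XOR the data with the keystream.  Running it a second time with
    the same password undoes the first run and restores the original."""
    ks = keystream(password, len(data))
    return bytes(a ^ b for a, b in zip(data, ks))

ciphertext = crypt("my pass phrase", b"ATDT 555-1212")
print(crypt("my pass phrase", ciphertext))   # -> b'ATDT 555-1212'
```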
|
|
| Simple enough concept, just don't give the password to someone you |
| don't trust. If you give the password to the wrong person, your whole business |
| is in jeopardy. Of course, that goes with just about anything you consider |
| important. |
|
|
     There are doubtlessly many "secure enough" ciphers in existence right
now.  Unfortunately, the availability of these methods is somewhat slim
because of exportation laws.  The "major" encryption programs which I believe
are worth talking about here are maintained by people foreign to the USA.
|
|
     The two methods of "conventional" encryption covered here are at least
not DES, which qualifies them as okay in my book.  This doesn't mean they are
impossible to break, but they don't have certain DES limitations which I know
exist, such as the 8-character password maximum.  The methods are: MDC, as
available in the package HPACK; and IDEA, as available in Pretty Good Privacy.
|
|
| Once you've installed PGP, we can start by practicing encrypting |
| some typical files on your PC. To conventionally encrypt your AUTOEXEC.BAT |
| file (it won't delete the file after encryption), use the following command: |
|
|
| C:\> pgp -c autoexec.bat |
| Pretty Good Privacy 2.1 - Public-key encryption for the masses. |
| (c) 1990-1992 Philip Zimmermann, Phil's Pretty Good Software. 6 Dec 92 |
| Date: 1993/01/19 03:06 GMT |
|
|
| You need a pass phrase to encrypt the file. |
| Enter pass phrase: { Password not echoed } |
| Enter same pass phrase again: Just a moment.... |
| Ciphertext file: autoexec.pgp |
|
|
| C:\> dir |
|
|
| Volume in drive C is RACK'S |
| Directory of c:\autoexec.pgp |
|
|
| autoexec.pgp 330 1-18-93 21:05 |
|
|
| 330 bytes in 1 file(s) 8,192 bytes allocated |
| 52,527,104 bytes free |
|
|
     PGP will compress the file before encrypting it.  I'd call this a
vulnerability on the grounds that the compressed file carries a known
signature, which could conceivably hand an attacker a piece of known
plaintext and make the overall encryption less secure.  Although no reports
have been made of someone breaking PGP this way, I'd feel more comfortable
with the compression turned off.  On the other hand, compression strips out
redundancy, and exploiting redundancy is itself a way of breaking ciphertext
-- though that attack isn't as reliable as checking for a known signature.
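     The known-signature worry is easy to see with any off-the-shelf
compressor.  The sketch below uses Python's stdlib zlib, not PGP's actual
packet format, purely to show that compressed streams open with predictable
bytes an attacker can treat as known plaintext.

```python
import zlib

# Every zlib stream produced with the default settings begins with the
# same two header bytes -- so if an attacker knows the plaintext was
# compressed first, they already know the opening bytes being encrypted.
samples = [b"@echo off\r\npath=c:\\dos", b"completely different text", b"x" * 1000]
headers = {zlib.compress(s)[:2] for s in samples}
print(headers)   # a single shared two-byte prefix for all three inputs
```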
|
|
     Although PGP will doubtlessly become the more popular of the two
programs, HPACK's encryption "strength" is that, being less popular, its
methods will probably not be as heavily researched as PGP's.  Of course, as
with PGP, new methods of encryption will doubtlessly be added as the program
is improved.
|
|
| Here is how you'd go about encrypting an entire file using the HPACK |
| program using the MDC "conventional" encryption: |
|
|
| C:\> hpack A -C secret.hpk secret.txt |
| HPACK - The multi-system archiver Version 0.78a0 (shareware version) |
| For Amiga, Archimedes, Macintosh, MSDOS, OS/2, and UNIX |
| Copyright (c) Peter Gutmann 1989 - 1992. Release date: 1 Sept 1992 |
|
|
| Archive is 'SECRET.HPK' |
|
|
| Please enter password (8..80 characters): |
| Reenter password to confirm: |
| Adding SECRET .TXT |
|
|
| Done |
|
|
     Anyway, I don't personally think HPACK will ever become truly popular
for any reason besides its encryption capabilities.  ZIP has been ported to
an amazing number of platforms, and in that popularity lies ZIP's encryption
weakness.  If you think ZIP is safe, remember that your password needs to
survive four years of attempted cracking in order to beat the Statute of
Limitations:
|
|
| Here is the introduction to ZIPCRACK, and what it had to say about how |
| easy it is to break through this barrier: |
|
|
| (Taken from ZIPCRACK.DOC) |
| ----- |
| ZIPCRACK is a program designed to demonstrate how easy it is to find |
| passwords on files created with PKZIP. The approach used is a fast, |
| brute-force attack, capable of scanning thousands of passwords per second |
| (5-6000 on an 80386-33). While there is currently no known way to decrypt |
| PKZIP's files without first locating the correct password, the probability that |
| a particular ZIP's password can be found in a billion-word search (which takes |
| about a day on a fast '486) is high enough that anyone using the encryption |
| included in PKZIP 1.10 should be cautious (note: as of this writing, PKZIP |
| version 2.00 has not been released, so it is not yet known whether future |
| versions of PKZIP will use an improved encryption algorithm). The author's |
| primary purpose in releasing this program is to encourage improvements in ZIP |
| security. The intended goal is NOT to make it easy for every computer user to |
| break into any ZIP, so no effort has been made to make the program |
| user-friendly. |
| ----- End Blurb |
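     The blurb's numbers are easy to sanity-check with some back-of-the-
envelope arithmetic (the function name below is mine, and the guess rates
are the blurb's own figures, not measurements):

```python
# Keyspace divided by guess rate gives search time; 86,400 seconds/day.
def search_days(keyspace: int, guesses_per_second: float) -> float:
    """How many days a full sweep of the keyspace takes."""
    return keyspace / guesses_per_second / 86400.0

# ZIPCRACK's 386-33 rate (~5,500/sec) against a billion-word search:
print(round(search_days(10**9, 5500), 1), "days")    # about 2 days
# A fast '486 at roughly double that rate hits the "about a day" claim:
print(round(search_days(10**9, 11000), 1), "days")   # about 1 day
```

Scale the rate up by a few orders of magnitude for government hardware and
the four-year statute window starts looking very comfortable for them.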
|
|
| Likewise, WordPerfect is even more vulnerable. I've caught a copy of |
| WordPerfect Crack out on the Internet and here is what it has to say about |
| WordPerfect's impossible-to-break methods: |
|
|
| (Taken from WPCRACK.DOC:) |
| ----- |
| WordPerfect's manual claims that "You can protect or lock your documents with a |
| password so that no one will be able to retrieve or print the file without |
| knowing the password - not even you," and "If you forget the password, there is |
| absolutely no way to retrieve the document." [1] |
|
|
Pretty impressive!  Actually, you could crack the password of a Word Perfect
5.x file on an 8 1/2" x 11" sheet of paper, it's so simple.  If you are counting
| on your files being safe, they are NOT. Bennet [2] originally discovered how |
| the file was encrypted, and Bergen and Caelli [3] determined further |
| information regarding version 5.x. I have taken these papers, extended them, |
| and written some programs to extract the password from the file. |
| ----- End Blurb |
|
|
| --------------------- |
| PUBLIC KEY ENCRYPTION |
| --------------------- |
|
|
| Back to the Masters of Deception analogy -- they were telephone |
| tapped. Conventional encryption is good for home use, because only one person |
| could possibly know the password. But what happens when you want to transmit |
| the encrypted data by telephone? If the Secret Service is listening in on your |
| phone calls, you can't tell the password to the person that you want to send |
| the encrypted information to. The SS will grab the password every single time. |
|
|
     Enter Public-Key encryption!  The concepts behind Public-Key are more
involved than conventional encryption.  The idea here is that passwords are
not exchanged; instead, each person hands out a "key" which tells others HOW
to encrypt a file for them.  This is called the Public Key.
|
|
| You retain the PRIVATE key and the PASSWORD. They tell you how to |
| decrypt the file that someone sent you. There is no "straight" path between |
| the Public Key and the Private Key, so just because someone HAS the public key, |
| it doesn't mean they can produce either your Secret Key or Password. All it |
| means is that if they encrypt the file using the Public Key, you will be able |
| to decrypt it. Furthermore, because of one-way encryption methods, the output |
| your Public Key produces is original each time, and therefore, you can't |
| decrypt the information you encrypted with the Public Key -- even if you |
| encrypted it yourself! |
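     A toy RSA round trip shows why handing out the Public Key gives away
nothing.  These primes are deliberately tiny and the function names are
mine; real PGP uses 384- to 1024-bit moduli and layers a random IDEA session
key on top, which is also why real PGP output differs on every run.

```python
# Toy RSA with deliberately tiny primes (illustration only).
p, q = 61, 53
n = p * q                 # modulus -- published as part of the Public Key
phi = (p - 1) * (q - 1)   # recovering this requires factoring n
e = 17                    # public exponent: (e, n) is the Public Key
d = pow(e, -1, phi)       # private exponent: (d, n) is the Secret Key

def pk_encrypt(m: int) -> int:    # anyone holding the Public Key can do this
    return pow(m, e, n)

def pk_decrypt(c: int) -> int:    # only the Secret Key holder can undo it
    return pow(c, d, n)

print(pk_decrypt(pk_encrypt(1993)))   # -> 1993
```

With a 1024-bit n instead of 3233, factoring n to recover phi (and thus d)
is exactly the "no straight path" the text describes.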
|
|
| Therefore, you can freely give out your own Public Key to anyone you |
| want, and any information you receive, tapped or not, won't make a difference. |
| As a result, you can trade anything you want and not worry about telephone |
taps!  This technique is supposedly being used to defend the United States'
Nuclear Arsenal, in case you doubt that it is secure.
|
|
     I've actually talked with some of the makers of the RSA "Public-Key"
algorithm, and, although they are quite brilliant individuals, I'm somewhat
miffed at their lack of enthusiasm for aiding the public in getting hold of
tools to use Public Key.  As a result, they are about to get railroaded by
people choosing to use PGP in preference to squat.
|
|
| Okay, maybe they don't have "squat" available. In fact, they have a |
| totally free package with source code available to the USA public (no |
| exportation of code) which people can use called RSAREF. Appendix E explains |
| more about why I'm not suggesting you use this package, and also how to obtain |
| it so you can see for yourself. |
|
|
| Now that we know the basic concepts of Public-Key, let's go ahead and |
| create the basics for effective tap-proof communications. |
|
|
| Generation of your own secret key (comments in {}s): |
|
|
| C:\> pgp -kg { Command used to activate PGP for key generation } |
| Pretty Good Privacy 2.1 - Public-key encryption for the masses. |
| (c) 1990-1992 Philip Zimmermann, Phil's Pretty Good Software. 6 Dec 92 |
| Date: 1993/01/18 19:53 GMT |
|
|
| Pick your RSA key size: |
| 1) 384 bits- Casual grade, fast but less secure |
| 2) 512 bits- Commercial grade, medium speed, good security |
| 3) 1024 bits- Military grade, very slow, highest security |
| Choose 1, 2, or 3, or enter desired number of bits: 3 {DAMN STRAIGHT MILITARY} |
|
|
| Generating an RSA key with a 1024-bit modulus... |
| You need a user ID for your public key. The desired form for this |
| user ID is your name, followed by your E-mail address enclosed in |
| <angle brackets>, if you have an E-mail address. |
| For example: John Q. Smith <12345.6789@compuserve.com> |
|
|
| Enter a user ID for your public key: |
| The Racketeer <rack@lycaeum.hfc.com> |
|
|
| You need a pass phrase to protect your RSA secret key. |
| Your pass phrase can be any sentence or phrase and may have many |
| words, spaces, punctuation, or any other printable characters. |
| Enter pass phrase: { Not echoed to screen } |
| Enter same pass phrase again: { " " " " } |
| Note that key generation is a VERY lengthy process. |
|
|
| We need to generate 105 random bytes. This is done by measuring the |
| time intervals between your keystrokes. Please enter some text on your |
| keyboard, at least 210 nonrepeating keystrokes, until you hear the beep: |
| 1 .* { decrements } |
| -Enough, thank you. |
| ...................................................++++ ........++++ |
| Key generation completed. |
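     The keystroke trick in the transcript above -- distilling randomness
from the timing between keypresses -- can be sketched in miniature.  The
function below is hypothetical and PGP's real randomness accumulator
differs; in real use the intervals would come from timing actual keystrokes
rather than the simulated list shown.

```python
import hashlib

def bytes_from_timings(intervals_ns, count: int) -> bytes:
    """Distill random bytes from keystroke-timing jitter by hashing the
    inter-keystroke intervals into a pool, then expanding the pool.
    (A sketch of the idea only -- PGP's actual accumulator differs.)"""
    pool = hashlib.sha256()
    for delta in intervals_ns:
        pool.update(delta.to_bytes(8, "little"))
    out, counter = b"", 0
    while len(out) < count:
        out += hashlib.sha256(pool.digest() + counter.to_bytes(4, "little")).digest()
        counter += 1
    return out[:count]

# Simulated inter-keystroke intervals in nanoseconds; a real program
# would measure these, e.g. with differences of time.perf_counter_ns().
simulated = [81234567, 90211111, 70002222, 88555333, 79123456]
print(len(bytes_from_timings(simulated, 105)))   # -> 105, as PGP asks for
```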
|
|
| It took a 33-386DX a grand total of about 10 minutes to make the key. |
| Now that it has been generated, it has been placed in your key ring. We can |
| examine the key ring using the following command: |
|
|
| C:\> pgp -kv |
| Pretty Good Privacy 2.1 - Public-key encryption for the masses. |
| (c) 1990-1992 Philip Zimmermann, Phil's Pretty Good Software. 6 Dec 92 |
| Date: 1993/01/18 20:19 GMT |
|
|
| Key ring: 'c:\pgp\pubring.pgp' |
| Type bits/keyID Date User ID |
| pub 1024/7C8C3D 1993/01/18 The Racketeer <rack@lycaeum.hfc.com> |
| 1 key(s) examined. |
|
|
| We've now got a viable keyring with your own keys. Now, you need to |
| extract your Public Key so that you can have other people encrypt shit and have |
| it sent to you. In order to do this, you need to be able to mail it to them. |
| Therefore, you need to extract it in ASCII format. This is done by the |
| following: |
|
|
| C:\> pgp -kxa "The Racketeer <rack@lycaeum.hfc.com>" |
| Pretty Good Privacy 2.1 - Public-key encryption for the masses |
| (c) 1990-1992 Philip Zimmermann, Phil's Pretty Good Software. 6 Dec 92 |
| Date: 1993/01/18 20:56 GMT |
|
|
| Extracting from key ring: 'c:\pgp\pubring.pgp', userid "The Racketeer |
| <rack@lycaeum.hfc.com>". |
|
|
| Key for user ID: The Racketeer <rack@lycaeum.hfc.com> |
| 1024-bit key, Key ID 0C975F, created 1993/01/18 |
|
|
| Extract the above key into which file? rackkey |
|
|
| Transport armor file: rackkey.asc |
|
|
| Key extracted to file 'rackkey.asc'. |
|
|
| Done. The end result of the key is a file which contains: |
|
|
| -----BEGIN PGP PUBLIC KEY BLOCK----- |
| Version: 2.1 |
|
|
| mQCNAisuyi4AAAEEAN+cY6nUU+VIhYOqBfcc12rEMph+A7iadUi8xQJ00ANvp/iF |
| +ugZ+GP2ZnzA0fob9cG/MVbh+iiz3g+nbS+ZljD2uK4VyxZfu5alsbCBFbJ6Oa8K |
| /c/e19lzaksSlTcqTMQEae60JUkrHWpnxQMM3IqSnh3D+SbsmLBs4pFrfIw9AAUR |
| tCRUaGUgUmFja2V0ZWVyIDxyYWNrQGx5Y2FldW0uaGZjLmNvbT4= |
| =6rFE |
| -----END PGP PUBLIC KEY BLOCK----- |
|
|
     This can be tagged to the bottom of whatever E-Mail message you want to
send or whatever.  This key can be added to someone else's public key ring
and thereby used to encrypt information so that it can be sent to you.  Most
people who use this on USENET add it to their signature files so that it is
automatically posted with their messages.
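     The "transport armor" wrapped around the key above is essentially
radix-64 (base64) encoding, which is why the block survives 7-bit E-Mail
intact.  A sketch of the idea using stdlib base64 -- the function names are
mine, and real PGP armor additionally appends a radix-64 CRC-24 checksum,
the short "=xxxx" line visible in the key block.

```python
import base64

def armor(data: bytes, label: str = "PGP MESSAGE") -> str:
    """Wrap binary data in a mail-safe radix-64 block.  (A sketch only --
    real PGP armor also carries a CRC-24 checksum and a Version line.)"""
    body = base64.encodebytes(data).decode()
    return f"-----BEGIN {label}-----\n\n{body}-----END {label}-----\n"

def dearmor(text: str) -> bytes:
    """Strip the BEGIN/END lines and blank lines, then decode."""
    lines = [ln for ln in text.splitlines()
             if ln and not ln.startswith("-----")]
    return base64.b64decode("".join(lines))

key_material = b"\x99\x00\x8d binary key bytes \x00\x01"
print(dearmor(armor(key_material)) == key_material)   # -> True
```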
|
|
| Let's assume someone else wanted to communicate with you. As a result, |
| they sent you their own Public Key: |
|
|
| -----BEGIN PGP PUBLIC KEY BLOCK----- |
| Version: 2.1 |
|
|
| mQA9AitgcOsAAAEBgMlGLWl8rub0Ulzv3wpxI5OFLRkx3UcGCGsi/y/Qg7nR8dwI |
| owUy65l9XZsp0MUnFQAFEbQlT25lIER1bWIgUHVkIDwxRHVtUHVkQG1haWxydXMu |
| Yml0bmV0Pg== |
| =FZBm |
| -----END PGP PUBLIC KEY BLOCK----- |
|
|
| Notice this guy, Mr. One Dumb Pud, used a smaller key size than you |
| did. This shouldn't make any difference because PGP detects this |
| automatically. Let's now add the schlep onto your key ring. |
|
|
| C:\> pgp -ka dumbpud.asc |
| Pretty Good Privacy 2.1 - Public-key encryption for the masses. |
| (c) 1990-1992 Philip Zimmermann, Phil's Pretty Good Software. 6 Dec 92 |
| Date: 1993/01/22 22:17 GMT |
|
|
| Key ring: 'c:\pgp\pubring.$01' |
| Type bits/keyID Date User ID |
| pub 384/C52715 1993/01/22 One Dumb Pud <1DumPud@mailrus.bitnet> |
|
|
| New key ID: C52715 |
|
|
| Keyfile contains: |
| 1 new key(s) |
| Adding key ID C52715 from file 'dumbpud.asc' to key ring 'c:\pgp\pubring.pgp'. |
|
|
| Key for user ID: One Dumb Pud <1DumPud@mailrus.bitnet> |
384-bit key, Key ID C52715, created 1993/01/22
This key/userID association is not certified.
|
|
| Do you want to certify this key yourself (y/N)? n {We'll deal with this later} |
|
|
| Okay, now we have the guy on our key ring. Let's go ahead and encrypt |
| a file for the guy. How about having the honor of an unedited copy of this |
| file? |
|
|
| C:\> pgp -e encrypt One {PGP has automatic name completion} |
| Pretty Good Privacy 2.1 - Public-key encryption for the masses. |
| (c) 1990-1992 Philip Zimmermann, Phil's Pretty Good Software. 6 Dec 92 |
| Date: 1993/01/22 22:24 GMT |
|
|
|
|
| Recipient's public key will be used to encrypt. |
| Key for user ID: One Dumb Pud <1DumPud@mailrus.bitnet> |
| 384-bit key, Key ID C52715, created 1993/01/22 |
|
|
| WARNING: Because this public key is not certified with a trusted |
| signature, it is not known with high confidence that this public key |
| actually belongs to: "One Dumb Pud <1DumPud@mailrus.bitnet>". |
|
|
| Are you sure you want to use this public key (y/N)? y |
| ------------------------------------------------------------------------------ |
|
|