EXAMINATION  FOR  PGDipSci  MSc  ETC  2000

 

COMPUTER SCIENCE

 

Software Security

 

(Time allowed: TWO hours)

 

Note:          Attempt ALL questions in the 12-page script book provided.  Each question is worth 10 marks.  There are eleven questions on this exam.  Your marker will omit your lowest-scoring answer when totalling your marks.  Total possible: 100 marks.

 

1.                  Consider the four types of security threat defined in Pfleeger’s book: interception, interruption, modification, fabrication.  Also consider the following clause in the “Guidelines for the Use of University Computing Facilities and Services”:

Users shall normally use computing facilities and services as they are provided, without attempting to modify or subvert them.  This includes: … not attempting to modify system facilities, to install viruses, to obtain illegally extra resources or to degrade the performance of any system.

Which of Pfleeger’s threats are controlled by this clause?  Explain your answer briefly, in approximately 50 words.

This clause controls the “modification” of any of the University’s computing resources: hardware, software and data.  It specifically controls modifications that might “interrupt” other users’ access (e.g. by the installation of viruses, or by the degradation of performance).  It also controls the “interception” of extra resources that might be obtained illegally.

2.                  Consider the ethical systems discussed in class, which include

·        Pfleeger’s “universal, self-evident, natural rules” (right to know, right to privacy, right to fair compensation for work);

·        Sir David Ross’ duties (fidelity, reparation, gratitude, justice, beneficence, nonmaleficence, self-improvement);

·        Christian ethics (Mosaic law, faith, hope, love, charity, Golden Rule);

·        Confucian ethics (Jen, Chun Tzu, Li, Te, Wen);

·        Islamic ethics (economic, social, military, religious).

Using any one of these ethical systems, analyse the following situation.

Your employer asks you to obfuscate some computer code, which was obtained by reverse-engineering a competitor’s product.  The primary purpose of the obfuscation, according to your employer, is to make it more difficult for the competitor to prosecute your company for violating the competitor’s copyright on their code.

In approximately fifty words, describe your ethical analysis.  Your answer should clearly indicate what your next action would be, i.e. you might start obfuscating the code, or you might discuss an ethical difficulty with your employer.  Your answer should also clearly indicate what ethical principles are relevant to your chosen action.  Your answer will be marked only on the clarity of your ethical reasoning, and not on the principles or actions you choose.

I’d use Pfleeger’s rules.  I believe my competitor has some rights to fair compensation, which should be balanced against my employer’s “right to know” (and to make constructive use of this knowledge).  I would discuss the situation with my employer, to seek an answer to the following question.  Am I being asked to disguise a copyright violation, or am I being asked to make it harder for our competitor to accuse us (incorrectly) of copyright violation?

3.                  Consider the following three goals for the security of Java bytecode: confidentiality, integrity, availability.  Name and briefly describe one software control (or countermeasure) that might preserve each of these goals.  Your answer should be in the form of three very short paragraphs, where each paragraph discusses one of the three goals.

Confidentiality can be increased with obfuscation, which will make it more difficult for a reverse engineer to capture the intellectual property that would be released in unobfuscated Java bytecode.

Integrity can be increased by “signing” Java bytecode, using (for example) the digital-signature technology discussed in Gong’s article.  Unauthorised changes to the bytecode would be detected.

Availability can be increased by using a robust JVM.  As discussed in Dean’s article, some JVMs offer more protection than others, against malicious-applet attacks that might disable or degrade system availability.
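The integrity control above can be sketched with the JDK’s standard signature API.  This is a minimal illustration only, not Gong’s actual scheme; the byte array standing in for compiled bytecode and the freshly generated key pair are assumptions:

```java
import java.security.KeyPair;
import java.security.KeyPairGenerator;
import java.security.Signature;

public class SignBytecode {
    public static void main(String[] args) throws Exception {
        // Stand-in for the bytes of a compiled .class file (an assumption).
        byte[] bytecode = "...compiled Java bytecode...".getBytes();

        // The producer generates a key pair and signs the bytecode.
        KeyPair pair = KeyPairGenerator.getInstance("RSA").generateKeyPair();
        Signature signer = Signature.getInstance("SHA256withRSA");
        signer.initSign(pair.getPrivate());
        signer.update(bytecode);
        byte[] sig = signer.sign();

        // A consumer verifies the signature with the producer's public key.
        Signature verifier = Signature.getInstance("SHA256withRSA");
        verifier.initVerify(pair.getPublic());
        verifier.update(bytecode);
        System.out.println("intact: " + verifier.verify(sig));

        // Any unauthorised change to the bytecode is detected.
        bytecode[0] ^= 1;
        verifier.initVerify(pair.getPublic());
        verifier.update(bytecode);
        System.out.println("tampered: " + verifier.verify(sig));
    }
}
```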

4.                  Consider Hauser’s proposal for software distribution, in which three payment transactions are defined:

a)      Consumer C pays a small amount of money (perhaps $20) to a vendor V.  The consumer C receives software S, and promises to pay a bill B.  The bill B contains the following information: the consumer’s ID, an amount of money M (perhaps $280) that the consumer promises to pay to a software producer P, the date, and a unique transaction number.

b)      Consumer C sends a copy of bill B to the producer P, along with the correct amount of money M.

c)      Producer P sends a copy of bill B to the vendor V, along with a moderate amount of money (perhaps $50).

Compare Hauser’s proposal with the two payment transactions in a typical distribution method:

a')     Vendor V pays an amount of money (perhaps $230) to a software producer P, obtaining in return a licensed copy of software S.

b')    Consumer C pays a large amount of money (perhaps $300) to vendor V, receiving the licensed copy of software S.

Briefly explain why Hauser’s proposal might reduce the incentive for a dishonest vendor to manufacture and sell pirated software S’ to an honest customer C.

An honest vendor will make a gross profit of $70 on each sale, in either system.

A dishonest vendor can make a gross profit of $300 per sale in the traditional distribution system, assuming that only a very rare (expert) customer can tell the difference between pirated and validly-licensed software.

A dishonest vendor in Hauser’s system could make a gross profit of $300 per sale only if they were clever enough to convince an honest customer to pay the bill for $280 to V rather than to P.  The dishonest vendor thus has to solve an additional problem in Hauser’s system (beyond that of producing pirated software): they must either produce a fraudulent bill B, or they must convince the honest customer that Hauser’s system is not being used by producer P.
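The profit comparison above can be checked with a little arithmetic (a sketch only; the dollar figures are the examples from the question):

```java
public class HauserProfits {
    public static void main(String[] args) {
        // Traditional distribution: an honest vendor buys a licence for $230
        // and sells it for $300.
        int honestTraditional = 300 - 230;
        // Hauser's scheme: the vendor keeps the $20 down payment and later
        // receives a $50 rebate from the producer.
        int honestHauser = 20 + 50;
        // A dishonest vendor selling a pirated copy keeps the whole $300 in
        // the traditional system, but only the $20 down payment in Hauser's,
        // unless the customer can also be tricked out of the $280 bill.
        int dishonestTraditional = 300;
        int dishonestHauser = 20;
        System.out.println("honest, traditional: $" + honestTraditional);
        System.out.println("honest, Hauser: $" + honestHauser);
        System.out.println("dishonest, traditional: $" + dishonestTraditional);
        System.out.println("dishonest, Hauser: $" + dishonestHauser);
    }
}
```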

5.                  The 1KP protocol does not offer “non-repudiation” for messages sent by buyers and merchants.

By contrast, the 2KP protocol will provide a buyer with a “non-repudiable” receipt R for a sales transaction with a merchant.  By “non-repudiable” we mean that the merchant would be unable to repudiate (refuse to acknowledge) that they (the merchant) constructed this receipt R.

Could any of the following messages be used as a non-repudiable receipt R?

a)      Message M1, consisting of the following information in plaintext: sales price, transaction ID, buyer’s ID, merchant’s ID, date and time of the transaction, a description of what was purchased and the address where it should be delivered;

b)      Message M2 = H(M1), where H(M1) is the 128-bit value of a one-way hash function H() applied to message M1 described above;

c)      Message M3 = (M1, Em(M2)), where Em(M2) is an encryption, using the merchant’s public key, of the 128-bit hash value M2 = H(M1);

d)      Message M4 = Sb(M3) = Sb(M1, Em(M2)), where Sb(M3) is an encryption, using the buyer’s private key, of message M3 described above.

Very briefly discuss each of the messages above, indicating whether the merchant would be able to make a convincing argument that someone other than the merchant could have constructed this message.

Message M1 could be produced by anyone who has seen a similar receipt produced by this merchant.  Message M2 could be produced by anyone who has seen a similar receipt, and who knows how to compute H() – which can’t be kept a secret.  Message M3 could be produced by anyone who knows the merchant’s public key and encryption method – neither of which is a secret.  Message M4 could be produced by anyone who has a public/private encryption key pair, which is easily obtained.  Therefore none of these messages is non-repudiable.
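The argument for M3 can be demonstrated concretely: every step uses only public information.  A minimal sketch, using MD5 for the 128-bit hash H() and a freshly generated RSA key pair standing in for the merchant’s (both assumptions):

```java
import java.security.KeyPair;
import java.security.KeyPairGenerator;
import java.security.MessageDigest;
import javax.crypto.Cipher;

public class RepudiableReceipt {
    public static void main(String[] args) throws Exception {
        // M1: the plaintext receipt (a stand-in string).
        byte[] m1 = "price;txnID;buyer;merchant;date;goods;address".getBytes();

        // M2 = H(M1): MD5 yields the 128-bit hash value from the question.
        byte[] m2 = MessageDigest.getInstance("MD5").digest(m1);

        // Em(M2): encryption under the merchant's PUBLIC key.  The key and
        // the encryption method are public, so anyone can perform this step.
        KeyPair merchant = KeyPairGenerator.getInstance("RSA").generateKeyPair();
        Cipher c = Cipher.getInstance("RSA");
        c.init(Cipher.ENCRYPT_MODE, merchant.getPublic());
        byte[] emM2 = c.doFinal(m2);

        // M3 = (M1, Em(M2)) required no secret held by the merchant, so the
        // merchant can plausibly deny having constructed it.
        System.out.println("hash bits: " + m2.length * 8);
        System.out.println("built M3 without any merchant secret: " + (emM2.length > 0));
    }
}
```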

6.                  The Resource ReserVation Protocol (RSVP) allows particular users to obtain preferential access to network resources.  For example, a user might employ RSVP to set up a path for a high-quality video stream.

During this term, you read a proposal for adding the following information to each RSVP message:

·        a 48-bit key identifier i, unique to the sender, indicating the index (but not revealing the value) of the sender’s encryption key ki which was used to calculate an encryption E(D) of the data portion D of this message;

·        a 64-bit sequence number, unique to this message, with a value that is larger than any sequence number sent on any previous message from this sender;

·        a 128-bit secure message digest, which is the result of applying a one-way hash function H() to the encryption E(D).

Which of the following statements best describes a security feature that might be provided by these additional information fields:

a)      The sender will be protected from denial-of-service attacks;

b)      The receiver will be protected from denial-of-service attacks;

c)      Unauthorised users will be unable to obtain network resources by modifying RSVP messages.

Explain your answer briefly.

Statement c is the best description.  An unauthorised user who attempts to modify an RSVP message must know the encryption key ki; without it, they have no realistic chance of constructing a modified message whose secure digest is consistent with a valid encryption E(D).  Therefore, as long as the encryption key stays secret, RSVP messages are secure against modification.  (Note: the sequence number protects against replay attacks.)
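A receiver’s checks might be sketched as follows.  This is an illustration of the idea only, not the actual RSVP proposal: the field names are hypothetical, and SHA-1 truncated to 128 bits stands in for the unspecified hash H():

```java
import java.security.MessageDigest;
import java.util.Arrays;

public class RsvpCheck {
    static long lastSeq = -1;  // highest sequence number seen from this sender

    // Hypothetical receiver-side check: reject replays via the sequence
    // number, and reject modified data via the 128-bit digest of E(D).
    static boolean accept(byte[] encryptedData, long seq, byte[] digest) throws Exception {
        if (seq <= lastSeq) return false;  // replayed or reordered message
        byte[] h = Arrays.copyOf(
            MessageDigest.getInstance("SHA-1").digest(encryptedData), 16);
        if (!MessageDigest.isEqual(h, digest)) return false;  // modified in transit
        lastSeq = seq;
        return true;
    }

    public static void main(String[] args) throws Exception {
        byte[] eD = "E(D) under key ki".getBytes();
        byte[] ok = Arrays.copyOf(MessageDigest.getInstance("SHA-1").digest(eD), 16);
        System.out.println(accept(eD, 1, ok));   // fresh, intact message
        System.out.println(accept(eD, 1, ok));   // same sequence number: replay
        eD[0] ^= 1;
        System.out.println(accept(eD, 2, ok));   // modified data: digest mismatch
    }
}
```

Note that, as the answer above says, the protection against meaningful modification ultimately rests on the secrecy of ki; the digest alone only detects accidental or naive tampering.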

7.                  Briefly describe one security weakness in Kerberos that does not involve a brute-force search for the authentication server’s decryption key.

The authentication server’s reply to a client is encrypted with a key derived from the client’s password.  If this reply is intercepted, an attacker can mount an offline password-guessing attack; once the password is found, the attacker can impersonate the client in future sessions with the authentication server.

Another known weakness: Kerberos is susceptible to denial-of-service attacks.  If the authentication server is unavailable, then new tickets cannot be produced.

8.                  Wallach et al. describe three methods for implementing extensible security in Java: extended stack introspection, capabilities, name space management.  Which of these methods is most similar to the “alias” technique employed in the “padded cell” security model of Safe-Tcl?  Explain briefly.

Name space management is essentially identical to the “alias” technique.  The interpreter (for Java or Tcl) can enforce security by controlling how names in a program are dynamically linked to runtime classes (in Java) or to commands (in Safe-Tcl).  Untrusted scripts will be given very restricted or no access to unsafe commands.
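The alias idea can be sketched as a command table from which unsafe names are removed or aliased (a minimal illustration; the command names and behaviours are invented, not Safe-Tcl’s actual ones):

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.Function;

public class PaddedCell {
    public static void main(String[] args) {
        // The full command table available to trusted code.
        Map<String, Function<String, String>> trusted = new HashMap<>();
        trusted.put("puts", s -> "printed: " + s);
        trusted.put("exec", s -> "ran: " + s);  // unsafe for untrusted scripts

        // Name space management: an untrusted script sees a table in which
        // unsafe names are missing, or aliased to restricted versions.
        Map<String, Function<String, String>> untrusted = new HashMap<>();
        untrusted.put("puts", trusted.get("puts"));
        untrusted.put("exec", s -> "denied: exec is not available in the padded cell");

        System.out.println(untrusted.get("puts").apply("hello"));
        System.out.println(untrusted.get("exec").apply("rm -rf /"));
    }
}
```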

9.                  Characterise the security of the “about: Mozilla” Easter Egg in Netscape.  Your brief answer should address each of Matheson’s three criteria for watermarks: fidelity, robustness, security.

The Easter Egg has excellent fidelity, as it does not affect the normal operation of Netscape in any way (except for users who want to see information about Mozilla).

The Easter Egg has excellent robustness, as it would not be removed by normal operations on Netscape (e.g. transmission, compression, installation).

The Easter Egg may have poor security.  It is probably not very difficult to attach a debugger to Netscape, thereby discovering the code that generates the Easter Egg.  Removal or modification might not be very time-consuming unless the code is cleverly obfuscated or tamperproofed.

10.              A software developer can use technology in JTimer to generate

·        a public/private encryption key pair, and

·        an encrypted “ticket” that can be included in an application.

The developer should provide licensed users with a copy of the public key.  The protected application should call JTimer’s checkTicket() method, to verify that the user has an appropriate public key for the “ticket” included in the application.  An invalid ticket or an incorrect public key will cause the checkTicket() method to return False, in which case the protected application should print a message inviting the user to purchase a license.

Briefly explain how this licensing scheme might be defeated.

Disassemble the class files using javap -c.  Search for the checkTicket() method, and examine the conditional tests in this method.  One path through the method will generate the result True.  Modify the conditional-branch bytecodes (e.g. replace “ifne” by “goto”) so that all paths through the checkTicket() method generate True.  Put the modified class file in the same place the original was found.
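The vulnerable pattern might look like the following sketch.  The checkTicket() logic here is hypothetical, not JTimer’s actual code; the comments indicate where the bytecode-level attack applies:

```java
public class LicenseCheck {
    // Hypothetical stand-in for JTimer's checkTicket(): the real method
    // verifies the embedded ticket against the user's public key.
    static boolean checkTicket(byte[] ticket, byte[] publicKey) {
        boolean valid = java.util.Arrays.equals(ticket, publicKey);
        // Compiled, this test becomes a conditional branch such as "ifne".
        // An attacker who rewrites that branch to an unconditional "goto"
        // forces every path through the method to return true.
        return valid;
    }

    public static void main(String[] args) {
        byte[] key = {1, 2, 3};
        if (checkTicket(new byte[]{9}, key)) {
            System.out.println("licensed");
        } else {
            System.out.println("please purchase a license");
        }
    }
}
```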


11.              Consider the three “if” statements in the following code, which was generated automatically by a software-security tool:

class A {
      static A[] nd = new A[ 300 ];
      A x, y;
      static public void s( int i1, int i2, int i3 ) {
            if( nd[i1]==null ) nd[i1] = new A( );
            if( nd[i2]==null ) nd[i2] = new A( );
            if( nd[i3]==null ) nd[i3] = new A( );
            nd[i1].x = nd[i2];
            nd[i1].y = nd[i3];
      }
}

 

 

If you believe these “if” statements are an obfuscation, your answer should briefly indicate what (if anything) in the original code was replaced by these statements.  If you believe these “if” statements are not an obfuscation, your answer should briefly describe what you think is their most likely purpose.

These three “if” statements appear to be creating part of a watermark: they conditionally create three nodes in what may well be a planted planar cubic tree containing up to 300 nodes.
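Assuming the class above is used as the answer suggests, repeated calls to s(parent, left, right) would trace out such a tree (a sketch; the calling sequence is invented, and the class is copied from the question with its declaration lowercased to compile):

```java
public class WatermarkDemo {
    // Copy of the generated class from the question.
    static class A {
        static A[] nd = new A[300];
        A x, y;
        static void s(int i1, int i2, int i3) {
            if (nd[i1] == null) nd[i1] = new A();
            if (nd[i2] == null) nd[i2] = new A();
            if (nd[i3] == null) nd[i3] = new A();
            nd[i1].x = nd[i2];
            nd[i1].y = nd[i3];
        }
    }

    public static void main(String[] args) {
        // Each call links a parent node to two children, so a sequence of
        // calls builds a tree whose shape could encode a watermark value.
        A.s(0, 1, 2);
        A.s(1, 3, 4);
        System.out.println(A.nd[0].x == A.nd[1]);
        System.out.println(A.nd[1].y == A.nd[4]);
    }
}
```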