Assignment 3

  1. ACL 8.5 (4 pts)

    Henley

    Think for a moment about how Henley works:

    1. Call read-text() to loop through all the characters in a file, until EOF.
    2. When you reach a character that isn't an alpha character or an apostrophe ('), call see() on the word you've buffered up so far.
    3. Look up the previous word in the special (Yikes!) hash table, and get an assoc list back. If the current word is in the list, add 1 to its occurrence count. If the current word isn't on the assoc list, add it and set its occurrence count to 1. (A sketch of see() and the *words* table appears after this list.)

      After this, you can begin generating text by:

    4. Call generate-text(), which then invokes random-next() on the "previous" word.
    5. Then random-next() plucks the assoc list from the hash table and picks a word at random, with probability in proportion to its occurrence count.
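
    To make this concrete, here's roughly what the table and see() look like -- a sketch written from memory of the ACL henley code (check your copy for the exact details; the sample sentence is invented):

    (defparameter *words* (make-hash-table :size 10000)) ; word -> assoc list of (next-word . count)

    ;; After reading "the cat sat on the mat." the table holds entries like
    ;;   |the| -> ((|mat| . 1) (|cat| . 1))
    ;;   |.|   -> ((|the| . 1))
    ;; see() records one transition: it bumps the count for symb in the assoc
    ;; list of the previous word, adding a new pair the first time.
    (let ((prev '|.|))
      (defun see (symb)
        (let ((pair (assoc symb (gethash prev *words*))))
          (if (null pair)
              (push (cons symb 1) (gethash prev *words*)) ; first time symb follows prev
            (incf (cdr pair))))                           ; otherwise bump its count
        (setf prev symb)))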

    What does this suggest as a strategy for checking quotes? Reverse the process that Henley uses: each time you check a word, it must appear in the hashed assoc list of the previous word. Otherwise, Henley could not have chosen it to produce the text in question. If you wanted to go further, you could check whether the frequency of occurrence in the text in question matches the frequency of occurrence in the original text, but this requires a bit of statistics. (I didn't expect anyone to do this.) You still need read-text(), but now instead of calling see() to update the hash table, you want to call a function (say, seen?()) that checks whether the word has already appeared after the previous one. I've renamed the version that calls seen?() to read-text?() so that I can put the code for reading, generating and checking in one file.

    (defun henley? (filepathname)
      (read-text? filepathname))
    (defun read-text? (pathname)
      (with-open-file (s pathname :direction :input)
        (let ((buffer (make-string maxword))
              (pos 0))
          (do ((c (read-char s nil :eof) 
                  (read-char s nil :eof)))
              ((eql c :eof) t)							;; Return t if we cycle through all
            (if (or (alpha-char-p c) (char= c #\'))
                (progn
                  (setf (aref buffer pos) c)
                  (incf pos))
              (progn
                (unless (zerop pos)
                  (unless (seen? (intern 
                                  (string-downcase         ;; Changed call to seen?()
                                   (subseq buffer 0 pos))))
                    (return nil))
                  (setf pos 0))
                (let ((p (punc c)))
                  (when p
                    (unless (seen? p)   ;; a mismatched punctuation "word" should also fail the check
                      (return nil))))))))))
    (let ((prev '|.|))  
      (defun seen? (symb)
        (let ((pair (assoc symb (gethash prev *words*))))
          (if (null pair)
              (progn
                (format t "Mismatch found: \"~A\" cannot be followed by \"~A\"" prev symb)
                nil)    ; Word not in assoc list
            (setf prev symb))))) ; If it's in the assoc list, then setf will return non-nil
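
    Putting it together (the file names here are just placeholders): build the table from the genuine text with the unmodified read-text(), then run the checker over the passage in question.

    (read-text "twain-original.txt")  ; populate *words* from the real corpus
    (henley? "suspect-quote.txt")     ; => T if every transition was seen, NIL (plus a message) otherwise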

  2. AIMA 2.5 (4 pts)

    This question asked you to extend the vacuum environment by adding things like L-shaped rooms and furniture. In addition, you should have created at least 3 different environments with a mix of the features described. Your write-up should include a table that describes the performance of at least 3 different agents in the 3 different environments. Comment on the results.

    The bulk of what you needed was already in agents/vacuum.lisp. To implement an agent, you needed to write a new program that you would then pass as a keyword argument:

    (defstructure (reactive-vacuum-agent 
       (:include agent
        (program 
         #'(lambda (percept)
    	 (destructuring-bind (bump dirt home) percept
    	   (cond (dirt 'suck)
    		 (home (random-element '(shut-off forward (turn right))))
    		 (bump (random-element '((turn right) (turn left))))
    		 (t (random-element '(forward forward forward
    					      (turn right) (turn left))))))))))
      "When you bump, turn randomly; otherwise mostly go forward, but
      occasionally turn.  Always suck when there is dirt.")

    Most people handled this part okay, but had trouble creating their own environments. The crux of the problem was this: the AIMA code is object-oriented, and people inherited from the parent of the vacuum world class but then didn't override the methods as vacuum world did. Take a look at environments/vacuum.lisp. Notice that the defstructure for the vacuum-world class inherits from grid-environment:

    (defstructure (vacuum-world (:include grid-environment
        (size (@ 8 8))
        (aspec '(random-vacuum-agent))
        (cspec '((at all (P 0.25 dirt))))))
      "A grid with some dirt in it, and by default a reactive vacuum agent.")

    The crucial observation is the overriding of methods that occurs in the class. Let's examine the get-percept() method for grid environments -- but it's not there. You have to follow the inheritance one more level up, to basic-env.lisp. For basic environments, however, not enough is known to establish what the percepts are -- if I tell you your agent is in an environment, but I don't specify whether it's an autonomous vehicle roving the Martian surface or a shopbot on the Web, you don't have enough information to establish the percepts. So the get-percept() method is:

    (defmethod get-percept ((env environment) agent)
      "Return the percept for this agent."
      (declare-ignore env agent)
      nil)

    The same is true for a grid environment. In a vacuum world, the world's constrained enough to allow you to determine the percepts:

    (defmethod get-percept ((env vacuum-world) agent)
      "Percept is a three-element sequence: bump, dirt and home."
      (let ((loc (object-loc (agent-body agent))))
        (list (if (object-bump (agent-body agent)) 'bump)
    	  (if (find-object-if #'dirt-p loc env) 'dirt)
    	  (if (equal loc (grid-environment-start env)) 'home))))

    Notice that this method is specialized for vacuum worlds -- the first argument is a vacuum-world environment, which is bound to the variable env. If you inherited from grid-environment and didn't override the default get-percept() method, you'd get an error when running the environment simulator with run-environment().
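
    For example, if you created your own environment structure, the fix is to give that structure its own method. Here's a minimal sketch, where l-shaped-vacuum-world is a hypothetical name for your class and the percept code simply mirrors the vacuum-world method above:

    (defstructure (l-shaped-vacuum-world (:include grid-environment
        (size (@ 10 10))))
      "A hypothetical custom world; because it inherits from grid-environment
      rather than vacuum-world, it must supply its own get-percept method.")

    (defmethod get-percept ((env l-shaped-vacuum-world) agent)
      "Same bump/dirt/home percept as the plain vacuum world."
      (let ((loc (object-loc (agent-body agent))))
        (list (if (object-bump (agent-body agent)) 'bump)
              (if (find-object-if #'dirt-p loc env) 'dirt)
              (if (equal loc (grid-environment-start env)) 'home))))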

  3. AIMA 2.7 (1 pt)

    Create a randomly sized room with a chance of dirt in each square.

    For this problem, you really only needed to figure out the correct call to make-vacuum-world(). If you used the random-integer() function defined in utilities/utilities.lisp, then the whole answer is:

    (make-vacuum-world :p 0.05
                       :x-size (random-integer 8 15)
                       :y-size (random-integer 8 15))
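
    If you want to watch it run, hand the result to the environment simulator mentioned in problem 2.5 -- a sketch, reusing the exact call above:

    (run-environment (make-vacuum-world :p 0.05
                                        :x-size (random-integer 8 15)
                                        :y-size (random-integer 8 15)))
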
  4. AIMA 3.4 (4 pts)

    Missionaries and cannibals

    Most of this was already set up for you in search/domains/cannibals.lisp.
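
    A minimal way to exercise it, assuming the standard AIMA entry points (make-cannibal-problem is the constructor generated by the defstructure in that file; breadth-first-search lives with the other search algorithms):

    (breadth-first-search (make-cannibal-problem)) ; returns the goal node, or NIL on failure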

  5. AIMA 3.5 (3 pts)

    Negative costs and looping.

    1. There are really two answers to this question. The "textbook" answer is: No, the algorithm must still explore every branch because any branch could lead to a sequence of negative-cost transitions, or a negative cost loop.

      However, for most uniform-cost searches, the node costs are only used to decide among several alternative nodes. Thus, only the relative, not the absolute, costs are important. If you add c to the cost of each node, this will preserve the ordering of nodes by cost, but you won't have any negative node costs, and you won't need to explore every node. For this to work, however, you have to know c in advance, and the problem wording is a bit ambiguous on this.

    2. A cost-minimizing agent would loop forever, endlessly decreasing the path cost. The only alternative would be to follow another loop with a smaller (more negative) cycle cost.
    3. A crucial point is that we live in a world where time marches on. In a sense, you can never return to the exact same state. One way this manifests itself is in decreasing rewards (negative costs), what economists call diminishing marginal returns. If you drive along a scenic loop, it will probably be less rewarding the second time around, and even less so the third. How can you reflect this in your own agent design? Include the mental state of the agent in the state of the world. Then, the agent's mental state ("I'm tired of looking at the same vista over and over") can be a factor in determining what to do next.
    4. One clue to this is in the previous answer: the mental state of the agent is important. Looping behavior is sometimes seen in psychological disorders. For example, obsessive-compulsive disorder (OCD) sufferers might wash their hands over and over, or check that the front door is locked repeatedly. Addiction is another example.

  6. AIMA 3.7 (3 pts)

    Pathless problems

    A problem still has an initial state, operators, and a goal test. You don't need a path cost function, though, and you don't need node costs. A solution is any state that passes the goal test. The General-Search algorithm is the same, except the queue only has states, not nodes.

    (defun general-search-pathless (problem queuing-fn)
      "Expand states according to the specification of PROBLEM until we find
      a solution or run out of states to expand.  The QUEUING-FN decides which
      state to look at first."
      (let ((states (make-initial-queue problem queuing-fn))
            state)
        (loop (if (empty-queue? states) (RETURN nil))
              (setq state (remove-front states))
              (if (goal-test problem state) (RETURN state))
              (funcall queuing-fn states (expand state problem)))))
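
    For instance, breadth-first search in this pathless style would just be the call below -- a sketch that assumes make-initial-queue and expand have been given state-based counterparts, and that uses the enqueue-at-end queuing function from the AIMA utilities:

    (general-search-pathless problem #'enqueue-at-end)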

    Which problems are good for pathless analysis? Cryptarithmetic, n-queens and VLSI layout are all good candidates. Nobody cares in what order you substituted for letters in a cryptarithmetic problem -- only the complete set of substitutions at the end.