
9.1. Turing Machines

9.1.1. A General Model of Computation

So far we have seen a few simple machine types, such as the DFA, NFA, and PDA. And we have seen corresponding grammars: regular grammars, context-free grammars, and so on. We have investigated the concept of nondeterminism. Nondeterminism does not affect the power of a DFA (the languages accepted by NFAs are the same as those accepted by DFAs). But nondeterminism does affect the power of PDAs (nondeterministic PDAs recognize more languages than deterministic PDAs do).

These machines all have some similarities. They all take an input string. They all march across the string from left to right, one character at each step. They stop when they reach the end of the string and then make a simple decision: if the machine is in a final state, then it accepts the string, and otherwise it rejects the string. In other words, they are acceptors for some language. All that they can do is determine whether a string is a member of the language, and the “more powerful” machines can accept or reject strings from more “complicated” languages.

But there is a lot more that we typically expect when we talk about “computation”. Like, well, the ability to “compute” rather than simply accept. What does “compute” actually mean? For example, we would like to be able to take as input the arithmetic expression “2+3” and compute the value 5. This means that we need to be able to output that computed value 5. Or put another way, we can “compute” a value by taking the input string “2+3” and replacing it with the output string “5”.

In its most general form, we can think of everything that a computer does as taking some string as input, and then providing some string as output. Of course, modern peripheral devices like keyboards, mice, and computer monitors give us rich ways to express input strings (perhaps as button presses), and rich ways to interpret output strings (say, as pictures). But it’s not a huge stretch of the imagination to consider computation as converting an input string to an output string.

This concept of converting is far more powerful than simple accepting. A machine that takes an input and provides an output is called a transducer.

In the next section, we will introduce a simple machine, the Turing Machine, that is a transducer. It is only slightly more complicated than the machines that we have seen so far, and only slightly different in its operation. But these differences are significant. Ultimately, we will see that a Turing Machine can do any computation (in its fundamental ability to convert one string into another) that even the most sophisticated modern computer can do.

Another issue is memory. DFAs have no memory beyond whatever process took them to their current state. PDAs have a stack. This made a huge difference in the languages that can be accepted: DFAs only accept regular languages, while (nondeterministic) PDAs accept any CFL.

Consider these three languages:

  • $L_1 = \{wcw^R : w \in \Sigma^*\}$, $\Sigma = \{a, b\}$.

  • $L_2 = \{ww^R : w \in \Sigma^*\}$, $\Sigma = \{a, b\}$.

  • $L_3 = \{ww : w \in \Sigma^*\}$, $\Sigma = \{a, b\}$.

The differences seem pretty minor, but the outcomes are quite different. $L_1$ is a deterministic CFL. $L_2$ is a nondeterministic CFL. $L_3$ is not a CFL at all.

The difference between $L_3$ and the others seems to relate to the limitations of the stack. As we read the first $w$ and load it into the stack, we don't have a good way to get to the bottom of the stack to see the first letter to compare against the first letter of the second $w$. But what if we considered other memory models? For example, a queue would solve our problem nicely. But then, a queue would not be able to handle $L_2$. Two stacks could simulate a queue (do you see how?), and so a machine with two stacks can handle all three of these languages. Perhaps we should investigate what class of languages a two-stack machine can handle? Perhaps we should consider other memory models?
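To make the two-stack idea concrete, here is a minimal sketch (in Python, not part of the original text) of how two stacks can simulate a queue: new items are pushed onto one stack, and dequeues are served from a second stack that is refilled, in reversed order, whenever it runs empty.

    # Hypothetical illustration: a FIFO queue built from two LIFO stacks.
    class TwoStackQueue:
        def __init__(self):
            self.inbox = []    # receives newly enqueued items
            self.outbox = []   # serves dequeues in first-in, first-out order

        def enqueue(self, x):
            self.inbox.append(x)

        def dequeue(self):
            if not self.outbox:              # refill only when empty, so the
                while self.inbox:            # order of earlier items is preserved
                    self.outbox.append(self.inbox.pop())
            return self.outbox.pop()

    q = TwoStackQueue()
    for ch in "abba":
        q.enqueue(ch)
    print(q.dequeue(), q.dequeue())          # prints: a b

Note that each item is pushed and popped at most twice in total, so nothing essential about the memory model is lost in the simulation.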

As it turns out, none of these ideas are as effective as the simple memory model that the Turing Machine uses.

9.1.2. Turing Machines


We now introduce the Turing Machine. A Turing Machine is a transducer, meaning that it can modify the input tape to generate an output. It is only slightly more complicated than the machines that we have seen so far, and only slightly different in its operation. But these differences are significant. Ultimately, we will see that a Turing Machine can do any computation that even the most sophisticated modern computer can do.
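As a preview of what the slideshow's machines do, here is a minimal sketch of a Turing Machine simulator in Python (our own illustration, not the book's notation): the transition function maps a (state, symbol) pair to a new state, a symbol to write, and a head move, and the tape left behind when the machine halts is the output.

    # A sketch of a TM as a transducer. delta, start, accept, and the example
    # machine below are all hypothetical, for illustration only.
    def run_tm(delta, start, accept, tape, blank="#"):
        cells = dict(enumerate(tape))   # sparse tape, blank everywhere else
        state, head = start, 0          # head starts on the leftmost input symbol
        while state != accept:
            sym = cells.get(head, blank)
            if (state, sym) not in delta:
                return None             # no applicable rule: the machine rejects
            state, write, move = delta[(state, sym)]
            cells[head] = write
            head += 1 if move == "R" else -1
        lo, hi = min(cells), max(cells)
        return "".join(cells.get(i, blank) for i in range(lo, hi + 1))

    # Example machine: rewrite every a as b, halting at the first blank.
    delta = {("q0", "a"): ("q0", "b", "R"),
             ("q0", "#"): ("halt", "#", "R")}
    print(run_tm(delta, "q0", "halt", "aaa"))   # output tape: bbb#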


9.1.3. Interpreting Turing Machines


A configuration for a Turing machine provides enough information to understand its current state of execution on a given string. We use the following notation: $(q, \underline{a}aba)$

This means that the TM is in state $q$, the tape contains $aaba$, and the read/write head is positioned on the underlined $a$. Recall that we assume that, at the start of processing the input for any TM, the read/write head is positioned on the leftmost non-blank character.

Don't forget that the tape is infinite in both directions. So to the left of the leftmost $a$ in this configuration there is an infinite number of blank squares, and to the right of the rightmost $a$ there is also an infinite number of blank squares.
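One lightweight way to hold the same information in code (a sketch using our own encoding, not the book's) is a record of the state, the tape contents, and the head position; the configuration $(q, \underline{a}aba)$ above then looks like this:

    # Hypothetical encoding of a TM configuration; the blank squares on either
    # side of the shown tape contents are left implicit.
    config = {"state": "q", "tape": "aaba", "head": 0}   # head on the underlined a

    def show(c):
        t, h = c["tape"], c["head"]
        return f'({c["state"]}, {t[:h]}[{t[h]}]{t[h+1:]})'   # brackets mark the scanned square

    print(show(config))   # (q, [a]aba)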


9.1.4. Turing-Decidable vs. Turing-Acceptable Languages


Recall that a Turing machine can accept or reject a string. We also noted that a Turing machine can go into an infinite loop. We will now examine that concept more closely.
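The distinction can be sketched with two hypothetical procedures (ordinary Python, not Turing machine transitions): a decider always halts with an accept or reject answer, while an acceptor is only guaranteed to halt on strings that are in its language and may loop forever on other strings.

    # Decider: always halts, accepting exactly the strings of even length.
    def decides_even_length(w):
        return len(w) % 2 == 0

    # Acceptor: halts (and accepts) whenever "ab" occurs in w, but scans
    # forever if it does not -- it never explicitly rejects.
    def accepts_contains_ab(w):
        i = 0
        while True:
            if w[i:i+2] == "ab":
                return True
            i = (i + 1) % max(len(w), 1)   # wrap around and keep looking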


9.1.5. Making More Complicated Machines

Obviously, Turing Machines can take an input and modify it. We will see examples of how this leads to powerful computational capability, even if they do not yet seem all that powerful. To get a quick idea of their power, consider the following relatively simple machine to accept $L = \{a^nb^nc^n : n > 0\}$. This is significant because this language is not context-free, which means that this simple Turing Machine is doing something that no DFA, NFA, or PDA can do!


In this slideshow, we will trace the acceptance or rejection of some strings. The given machine accepts $L = \{a^nb^nc^n: n > 0\}$.

[Figure: state diagram of the Turing Machine for $L = \{a^nb^nc^n : n > 0\}$, with states $q_0$ through $q_6$ and transitions labeled read;write,move (for example, a;A,R reads an $a$, writes an $A$, and moves the head right).]
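The machine's strategy is a repeated marking pass: cross off one $a$ (as $A$), one $b$ (as $B$), and one $c$ (as $C$) on each trip across the tape, and accept only if everything gets crossed off. Here is a rough sketch of that idea in ordinary Python (our own restatement, not the machine's transitions):

    import re

    def accepts_anbncn(w):
        # The machine's left-to-right scan enforces the shape a...ab...bc...c.
        if not re.fullmatch(r"a+b+c+", w):
            return False
        tape = list(w)
        while "a" in tape:                       # one pass per unmarked a
            for ch, mark in (("a", "A"), ("b", "B"), ("c", "C")):
                if ch not in tape:
                    return False                 # counts do not match: reject
                tape[tape.index(ch)] = mark
        return "b" not in tape and "c" not in tape   # a leftover b or c also rejects

    print(accepts_anbncn("aabbcc"), accepts_anbncn("aabbc"))   # True False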



While Turing machines might be able to do powerful things, programming them at the level of individual states can get rather difficult and tedious. In fact, it might feel in some ways like writing machine code or assembly language.

The secret to success in modern software development is to build up more powerful tools, especially by packaging behavior together and manipulating the packages through specific interfaces.

We can hope to build up similar capability with Turing Machines.
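For instance, we might package common head movements as reusable operations and chain them together, rather than writing raw transitions each time. The sketch below is only an analogy in Python; the names and the tape representation are our own.

    BLANK = "#"

    def move_right_until(tape, head, symbol):
        # analogue of an "R_symbol" building block: scan right to the next symbol
        while tape[head] != symbol:
            head += 1
        return head

    def move_left_until(tape, head, symbol):
        # analogue of an "L_symbol" building block: scan left to the next symbol
        while tape[head] != symbol:
            head -= 1
        return head

    tape = list("#abba#")
    head = 1                                     # leftmost non-blank square
    head = move_right_until(tape, head, BLANK)   # like running an R_# block
    print(head, tape[head])                      # 5 #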




In this slideshow, we will see how a copy machine processes the input string 'ab'.
The following machine is a copy machine that can transform $\#w\#$ into $\#w\#w\#$.
(Note: $w$ represents the input string, 'ab' in this case.)

[Figure: block diagram of the copy machine, built from building blocks such as $R_\#$, $R^{2}_{\#}$, $L^{2}_{\#}$, $L_\#$, and $\sigma$.]
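In plain code the net effect of the copy machine is easy to state (a sketch of the input/output behavior only; the real machine produces it square by square, shuttling the head back and forth):

    # Hypothetical restatement of what the copy machine computes on its tape.
    def copy_machine(tape):
        assert tape.startswith("#") and tape.endswith("#")
        w = tape[1:-1]                 # the word between the # markers
        return "#" + w + "#" + w + "#"

    print(copy_machine("#ab#"))        # #ab#ab#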

9.1.6. Turing Machine Extensions


When we give extensions or new functionality to a computing system, sometimes they change something fundamental about the capabilities of the system. For example, when we add nondeterminism to an algorithm, we might change the cost of the underlying problem from exponential to polynomial time. But other changes do nothing fundamental. In terms of Turing machines, our concern is what the machine can do, rather than how long it takes to do it. Does nondeterminism help us to solve the Halting problem? No. Likewise, the following extensions do not increase the power of Turing Machines.

