
Search

Computation vs Search

Computation: processes where each step is always necessary to achieve the goal → predictability

Search: processes where, after they have finished, you can identify steps that did not contribute to achieving the goal

No free lunch theorem

"For every search system there is a search instance that shows the worst case behavior"

Model (most important thing)

A = (S, T)
The model describes all possibilities. During the search, the system is in one of the states and moves around using T.

S: set of possible states, aka the search space
T \subseteq S x S: set of available transitions
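Below is a minimal sketch of such a model in Python; the `SearchModel` class and the example locations are illustrative assumptions, not part of the lecture.

```python
from dataclasses import dataclass

# Sketch of a search model A = (S, T). Class name, field names and the
# example states are illustrative assumptions.
@dataclass(frozen=True)
class SearchModel:
    states: frozenset       # S: the search space
    transitions: frozenset  # T, a subset of S x S: pairs (s, s') of allowed moves

    def successors(self, s):
        """All states reachable from s via one transition in T."""
        return {t for (a, t) in self.transitions if a == s}

# Example: a tiny map with four locations.
A = SearchModel(
    states=frozenset({"Uni", "Station", "Park", "Home"}),
    transitions=frozenset({("Uni", "Station"), ("Station", "Park"),
                           ("Park", "Home"), ("Uni", "Park")}),
)
```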

Process

To select the next search state, environment information is integrated; this is essential for efficiency.
P = (A, Env, K)

  • A: search model
  • Env: something external to the model (e.g., data from sensor input)
  • K: S x Env → S - the search control; the thing that makes the decision
    K(s, e) = s'
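A sketch of a process, reusing the model `A` from the sketch above; the concrete environment values and the greedy control `K` are illustrative assumptions, not the lecture's definitions.

```python
# Sketch of a search process P = (A, Env, K), reusing the illustrative
# SearchModel `A` from above.

# Env: data external to the model (e.g. from sensor input); here an assumed
# estimated distance to Home. It does not change during the search.
Env = {"Uni": 4, "Station": 3, "Park": 2, "Home": 0}

def K(s, env):
    """Control K: S x Env -> S. Moves to the successor the environment rates best."""
    candidates = A.successors(s)
    if not candidates:
        return s  # no transition available: stay where we are
    return min(candidates, key=lambda t: env.get(t, float("inf")))
```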

Instance

Ins = (s0, G): the selection of one concrete version of a problem.
The search control just moves between states, so we need to give it a goal so it knows when it should stop.

  • s0 \in S - start state
  • G: S → {yes, no} - goal condition

Example

Using Google Maps. Instance: finding a path from Uni to Home. Goal: show me a path to reach my destination.
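A sketch of the instance for this example; the concrete start state and goal test below are just one illustrative choice.

```python
# Sketch of a search instance Ins = (s0, G) for the Uni-to-Home example.
s0 = "Uni"          # start state, s0 in S

def G(s):
    """Goal condition G: S -> {yes, no}: is the destination reached?"""
    return s == "Home"
```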

Derivation

The history of the search: given an instance, we start at s0 and apply the process, which results in a sequence of successor states. That sequence of states is the search derivation.
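A sketch of producing a derivation by running the illustrative process above on the illustrative instance; the `run` helper and `max_steps` cutoff are assumptions, not lecture terminology.

```python
# Sketch of a search derivation: repeatedly apply the control K, starting
# at s0, until the goal condition G holds. Builds on the illustrative
# A, Env, K, s0 and G from the sketches above.
def run(s0, G, env, max_steps=100):
    derivation = [s0]        # the history of the search
    s = s0
    for _ in range(max_steps):
        if G(s):
            break            # goal reached: the search is done
        s = K(s, env)        # K(s_i, e_i) = s_{i+1}
        derivation.append(s)
    return derivation

# run(s0, G, Env) yields the derivation ['Uni', 'Park', 'Home']
```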

State vs Environment

Environment

Data outside of the knowledge base; things the system does not know from the start.
This data never changes during the search.

State

Anything that relates to the solution: the result of the reasoning, or the knowledge you are building up.

Questions

  • Computation: processes where each step is always necessary to achieve the goal
  • Search: processes where, after they have finished, you can identify steps that did not contribute to achieving the goal

What are the advantages of computation, what are the disadvantages?

  • Advantages: predictable runtime; no choices to deal with; no unnecessary steps

  • Disadvantages: the representation is very implicit → difficult to know what is going on; not every problem can be solved this way

What is the ‘No Free Lunch Theorem’?

"For every search system there is a search instance that shows the worst case behavior"

What is the definition of a search model? (be able to describe the parts) aka What are the components of a model?

Definition

A model defines:

  • the main data structure and its possibilities
  • what the control can work with
  • the limits on the choices of the control

Components

A = (S, T)
S: set of all possible states
T: transitions between states, T \subseteq S x S

What is the definition of a search process? (be able to describe the parts)

Definition

Search process defines:

  • how to deal with the indeterminism of the search model
  • how to deal with all possible states and with the search you want to perform

Components

P = (A, Env, K)
A: search model
Env: environment - something external to the model (e.g., data from sensor input)
K: S x Env → S - search control; makes the decision to transition from s to s'
K(s, e) = s'

What is the definition of a search instance? (be able to describe the parts) aka What are the components of a search instance?

Definition

A search instance defines:

  • the concrete input for a search run
  • when the search ends
  • it is normally generated from user input

Components

Ins = (s0, G)

  • s0 \in S - start state
  • G: S → {yes, no} - tells which states are goal states; G(si) = result, where si \in S and result = yes if the search is done, no otherwise

What does the goal function tell you?

Termination. It tells you when the search is done.

What is the definition of a search derivation? (be able to describe the parts)

A search process P applied to an instance Ins leads to a sequence of states (the history of the search) s0, ..., si, si+1, ... with K(si, ei) = si+1, where si, si+1 \in S and ei \in Env.

→ it protocols a search run, so the quality of the search control can be analyzed and the solution can be determined

For what do we need the components of a search model, process, instance, derivation?

  • S: to define all possible states

  • T: to define all possible transitions between states. With these, the search model defines the structure and what the search control has to deal with

  • A: to define the model that the search process has to deal with

  • Env: external data that the search control needs to take into consideration

  • K: to make the decision based on the current state and the environment

  • s0: to know the initial state

  • G: to know the halting (goal) condition

  • Derivation: to verify the quality of the search control; it may also be inspected to determine the solution

What is a state/search space? What is a value space/fitness space?

The search space is S, i.e. the set of all possible states.

  • Env: databases; things that do not change throughout the search; things not included in the knowledge base from the start
  • State: things related to the solution; the reasoning or results of what you are building