Local Search Algorithms
- In many optimization problems, the path is irrelevant; the goal state itself is the solution
- Then state space = set of ‘complete’ configurations;
- find configuration satisfying constraints (n-queens problem) or find optimal configuration (travelling salesperson problem)
- In such cases, we can use iterative improvement algorithms
- keep a single ‘current’ state, try to improve it
- constant space, suitable for online as well as offline search
More or less unavoidable if the ‘state’ is yourself
- Advantages
- Uses only a small, constant amount of memory
- Can find reasonable solutions in large or infinite state spaces
Hill Climbing
- Idea
- Start wherever
- Repeat: move to the best neighboring state
- If no neighbors better than current, quit
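The loop above can be sketched in a few lines; the toy objective and integer neighbor function below are illustrative assumptions, not from the lecture:

```python
def hill_climb(start, neighbors, score):
    """Steepest-ascent hill climbing: repeatedly move to the best
    neighboring state; quit when no neighbor beats the current state."""
    current = start
    while True:
        best = max(neighbors(current), key=score, default=current)
        if score(best) <= score(current):
            return current  # local optimum reached
        current = best

# Toy example (assumption): maximize f(x) = -(x - 3)^2 over the integers
f = lambda x: -(x - 3) ** 2
nbrs = lambda x: [x - 1, x + 1]
print(hill_climb(0, nbrs, f))  # climbs 0 -> 1 -> 2 -> 3 and stops
```

Note that the algorithm stops at the first local optimum it finds, which is exactly why the stochastic variants below exist.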
Simulated Annealing
- Idea
- Annealing process
- used to cool metals slowly to reach an ordered (low-energy) state
- Occasionally allow ‘bad’ moves
- At high temperature, worse moves are more likely to be accepted
- → shakes the system out of its local minimum
- Gradually decrease the temperature according to some schedule
- Convergence
- The more downhill steps you need to escape a local optimum, the less likely you are to ever make them all in a row
- ‘Slowly enough’ may mean exponentially slowly
- Random-restart hill climbing also converges to the optimal state
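A minimal sketch of simulated annealing, assuming the standard Metropolis acceptance rule exp(−ΔE/T) and a geometric cooling schedule; the 1-D energy function is made up for illustration:

```python
import math
import random

def simulated_annealing(start, neighbor, energy, t0=1.0, cooling=0.995, steps=5000):
    """Always accept downhill moves; accept uphill ('bad') moves with
    probability exp(-dE/T). T shrinks geometrically each step (one
    common schedule choice, an assumption here)."""
    current, t = start, t0
    for _ in range(steps):
        candidate = neighbor(current)
        d_e = energy(candidate) - energy(current)
        if d_e < 0 or random.random() < math.exp(-d_e / t):
            current = candidate
        t *= cooling
    return current

# Toy energy with a local and a global minimum (made up for illustration)
random.seed(0)
energy = lambda x: (x * x - 4) ** 2 + x        # global minimum near x = -2
step = lambda x: x + random.uniform(-0.5, 0.5)
result = simulated_annealing(0.0, step, energy)
print(result)
```

At high T the exp(−ΔE/T) term is close to 1, so almost any move is accepted; as T → 0 the loop degenerates into ordinary hill climbing, which matches the convergence discussion above.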
Local Beam Search
- Basic idea:
- K copies of a local search algorithm, initialized randomly
- For each iteration
- Generate all successors from the K current states
- Choose best K of these to be the new current states
- Why is this different from K local searches in parallel?
Genetic Algorithms
- Idea
- Genetic algorithms use a natural selection metaphor
- At each step (selection), resample the K individuals, weighted by a fitness function
- Combine by pairwise crossover operators, plus mutation to give variety
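A minimal sketch of the selection/crossover/mutation loop on the classic OneMax problem (maximize the number of 1-bits in a string); the population size, mutation rate, and single-point crossover are illustrative choices, not from the lecture:

```python
import random

def genetic_algorithm(pop, fitness, crossover, mutate, generations=100):
    """Fitness-weighted resampling (selection), then pairwise
    crossover, then mutation, repeated for a fixed number of generations."""
    for _ in range(generations):
        weights = [fitness(ind) for ind in pop]
        parents = random.choices(pop, weights=weights, k=len(pop))
        pop = [mutate(crossover(parents[i], parents[(i + 1) % len(parents)]))
               for i in range(len(parents))]
    return max(pop, key=fitness)

# OneMax setup (all values below are illustrative assumptions)
random.seed(2)
N = 12
fit = lambda s: sum(s) + 1                    # +1 keeps selection weights positive

def cross(a, b):
    cut = random.randrange(1, N)              # single-point crossover
    return a[:cut] + b[cut:]

def mut(s, rate=0.05):
    return [bit ^ (random.random() < rate) for bit in s]  # flip each bit w.p. rate

pop0 = [[random.randint(0, 1) for _ in range(N)] for _ in range(20)]
best = genetic_algorithm(pop0, fit, cross, mut)
print(best, sum(best))
```

Crossover "breaks and recombines" parent states, while mutation supplies the variety that keeps the population from collapsing onto one string.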
Continuous State Spaces
- EX: Siting Airports in Romania
- f(x): the sum of squared distances from each city to its nearest airport
- f = Σ_a Σ_{c ∈ C_a} [ (x_a − x_c)² + (y_a − y_c)² ], where C_a is the set of cities whose nearest airport is a
- Objective: minimize f
- Discretize it!
- Define a grid with increment 𝛿, use any of the discrete algorithms
- Converts the continuous space into a discrete one
- Choose random perturbations to the state
- First-choice hill-climbing: keep trying until something improves the state
- Simulated annealing
- Compute gradient of 𝑓(𝑥) analytically
- Update the state in the direction of the negative gradient to reduce f (gradient descent)
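With the nearest-airport assignment C_a held fixed during a step, the objective above has the closed-form gradient ∂f/∂x_a = 2 Σ_{c ∈ C_a} (x_a − x_c), and likewise for y. A sketch of gradient descent on this objective; the city coordinates, airport starting positions, and learning rate are made-up assumptions:

```python
def airport_gradient_step(airports, cities, lr=0.1):
    """One gradient-descent step on f = sum of squared city-to-nearest-
    airport distances, holding the nearest-airport assignment fixed:
    df/dx_a = 2 * sum_{c in C_a} (x_a - x_c), likewise for y."""
    new = []
    for ax, ay in airports:
        # C_a: cities whose nearest airport under the current layout is (ax, ay)
        mine = [c for c in cities
                if min(airports,
                       key=lambda a: (a[0] - c[0]) ** 2 + (a[1] - c[1]) ** 2) == (ax, ay)]
        gx = 2 * sum(ax - cx for cx, _ in mine)
        gy = 2 * sum(ay - cy for _, cy in mine)
        new.append((ax - lr * gx, ay - lr * gy))
    return new

# Illustrative data: four cities, two airports (all coordinates assumed)
cities = [(0, 0), (0, 2), (10, 0), (10, 2)]
airports = [(1.0, 1.0), (9.0, 1.0)]
for _ in range(50):
    airports = airport_gradient_step(airports, cities)
print(airports)  # each airport settles at the centroid of its cities
```

Setting the gradient to zero shows why: each airport's optimum (for a fixed assignment) is the centroid of the cities assigned to it.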
Summary
- Many configuration & optimization problems can be formulated as local search
- General families of algorithms
- Hill-climbing, continuous optimization
- Simulated annealing (and other stochastic methods)
- Local beam search: multiple interacting searches
- Genetic algorithms: break and recombine states
- Many machine learning algorithms are local searches