  1. #1
    Join Date
    Jul 2006
    Location
    Arizona
    Posts
    2

    Knowledge normalization

    Hello

    I am a developer of artificial intelligence / natural language processing computer systems. My career of 40 years has been quite an adventure and learning experience. My work has evolved into a methodology (a new paradigm) for 'knowledge normalization', merging AI with NLP. I have published two articles: one describing a successful installation of knowledge normalization, and a second describing the new paradigm in more detail.

    My question to DB developers is: what will be the impact of creating knowledge bases from normalized knowledge upon the methodology for data normalization? Will the former replace the latter, or will there be a merging of interface protocols?

    Regards.

    Nicholas Zendelbach

    Articles:
    The IDC Story: The First Successful AI Based Multi-Expert System In Arizona.
    Examine a multi-expert system generator, Rose Navigator, and an Enterprise Resource Plan to help manage the need for human engineers against the dynamics of customer expectations and orders.
    Pages: 39 through 45, also pages 1 and 5.
    www.pcai.com/web/6a72/522.11.42/TOC.htm (copy and paste into your browser)

    JUNE 2004 Publication: PCAI Magazine.
    The Heuristic Life Cycle of a Multi-Expert System.
    Introduction: the purpose of this article is to introduce a new paradigm in the discipline of engineering human knowledge.
    Abstract: This article introduces a new paradigm to the discipline of engineering human knowledge, one that we divide into four tenets of knowledge representation:
    1. The four prime domains of knowledge.
    2. All human knowledge has, at its root, a language to communicate the knowledge.
    3. A single language sentence contains the smallest unit of knowledge, and it is possible to normalize and codify this unit of knowledge into a multi-expert computer system (Language representation).
    4. A knowledge based computer system can learn as well as teach.
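    As an illustration of tenet 3, a single sentence could be reduced to a subject-predicate-object unit. This is only a sketch: the triple structure and the `normalize_sentence` helper are my own assumptions for illustration, not the article's actual method.

    ```python
    # Minimal sketch: treat one English sentence as one knowledge unit and
    # normalize it into a (subject, predicate, object) triple, so that
    # equivalent sentences map to the same unit. Illustrative only; this is
    # not the representation described in the article.

    from typing import NamedTuple

    class KnowledgeUnit(NamedTuple):
        subject: str
        predicate: str
        obj: str

    def normalize_sentence(subject: str, predicate: str, obj: str) -> KnowledgeUnit:
        """Trim and lowercase each part so surface variations collapse to one unit."""
        return KnowledgeUnit(subject.strip().lower(),
                             predicate.strip().lower(),
                             obj.strip().lower())

    # "An engineer designs fixtures." -> one normalized unit
    unit = normalize_sentence("An engineer", "designs", "fixtures")
    print(unit)
    ```

    Under this sketch, "  An Engineer " and "an engineer" normalize to the same subject, which is the sense in which a sentence-level unit can be deduplicated the way a data element is.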

    This paradigm, as illustrated in the article, is the result of research and development and the resulting creation of a multi-expert system generator. The methodology of the multi-expert system generator is a self-designing system: it constructs and designs attributes that are an integral part of the methodology, process, and architecture used to generate the multi-expert system.

    Magazine Article:
    http://www.pcai.com/web/6t6y6t/6t6y6y.7.02/TOC.htm (copy and paste into your browser)

  2. #2
    Join Date
    Apr 2002
    Location
    Toronto, Canada
    Posts
    20,002
    um, okay

    i especially like the concept of merging interface protocols, that was always enjoyable every time i did it -- like they say, when it's good it's really good, and when it's bad it's still pretty good
    rudy.ca | @rudydotca
    Buy my SitePoint book: Simply SQL

  3. #3
    Join Date
    Jul 2006
    Location
    Arizona
    Posts
    2
    Thank you for the interest in my work. Can you recommend a logical pattern that would relate a normalized knowledge base (CAMBO), in which a single English grammatical sentence is a single knowledge element, normalized in much the same way as data elements, to a data-normalized database?
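    For illustration, one candidate logical pattern (a sketch under my own assumptions; the schema and the triple shape are not the CAMBO design) is to store each sentence-level knowledge element as a row of foreign keys into a single term table, so knowledge elements deduplicate exactly the way data normalization deduplicates data elements:

    ```python
    # Sketch: bridge a normalized knowledge base and a data-normalized database
    # by keeping every distinct term in one table and each sentence as a row of
    # foreign keys. Schema and triple structure are illustrative assumptions.

    import sqlite3

    conn = sqlite3.connect(":memory:")
    cur = conn.cursor()
    cur.executescript("""
        CREATE TABLE term (
            term_id INTEGER PRIMARY KEY,
            text    TEXT NOT NULL UNIQUE
        );
        CREATE TABLE knowledge_unit (          -- one row = one sentence
            subject_id   INTEGER NOT NULL REFERENCES term(term_id),
            predicate_id INTEGER NOT NULL REFERENCES term(term_id),
            object_id    INTEGER NOT NULL REFERENCES term(term_id),
            PRIMARY KEY (subject_id, predicate_id, object_id)
        );
    """)

    def term_id(text: str) -> int:
        """Insert the term once; later sentences reuse the same row (no redundancy)."""
        cur.execute("INSERT OR IGNORE INTO term (text) VALUES (?)", (text,))
        cur.execute("SELECT term_id FROM term WHERE text = ?", (text,))
        return cur.fetchone()[0]

    def add_sentence(subject: str, predicate: str, obj: str) -> None:
        cur.execute(
            "INSERT OR IGNORE INTO knowledge_unit VALUES (?, ?, ?)",
            (term_id(subject), term_id(predicate), term_id(obj)),
        )

    add_sentence("engineer", "designs", "fixture")
    add_sentence("engineer", "reviews", "order")   # "engineer" stored only once

    cur.execute("SELECT COUNT(*) FROM term")
    print(cur.fetchone()[0])  # 5 distinct terms across 2 sentences
    ```

    The point of the sketch is that the knowledge side and the data side can share one normalization discipline: terms play the role of attributes, and each sentence is a fully keyed relation over them.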

    Nicholas Zendelbach

  4. #4
    Join Date
    Apr 2002
    Location
    Toronto, Canada
    Posts
    20,002
    no, i'm sorry, i cannot

    probably due to the fact that i have no idea what any of this is about
    rudy.ca | @rudydotca
    Buy my SitePoint book: Simply SQL

  5. #5
    Join Date
    Feb 2004
    Location
    In front of the computer
    Posts
    15,579
    The biggest problem you're likely to face is that NLP is by definition NP-complete. There are an infinite number of ways to express non-trivial assertions in a natural language. You can use languages that approximate natural language (Esperanto is a good example, especially for knowledge processing), but those are decidedly NOT true NLP.

    As you've noted, the KE / Knowledge Normalization is dependent on some way to represent that knowledge. Traditional systems have done this via XM systems for the internal (usually Lisp) structures used to represent the underlying knowledge. I don't know of a successful system that uses an abstract form (such as NL) to represent the knowledge structures. I'd be interested in hearing more about that.

    IDC had a fairly good reputation in the valley, but I don't remember you. The turnover there was always dizzying, due to the prevailing "parts is parts" attitude about staff. I assume that you're working from Portland or at least somewhere outside of the valley.

    -PatP
