About 3,410,000 results
  1. In digital computing and telecommunications, a unit of information is the capacity of some standard data storage system or communication channel, used to measure the capacities of other systems and channels. In information theory, units of information are also used to measure information contained in messages and the entropy of random variables.
    en.wikipedia.org/wiki/Units_of_information

  3. Quantities of information - Wikipedia

    The mathematical theory of information is based on probability theory and statistics, and measures information with several quantities of information. The choice of logarithmic base in the following formulae determines the unit of information entropy that is used. The most common unit of information is the bit, or more …

    Shannon derived a measure of information content called the self-information or "surprisal" of a message $m$: $I(m) = \log\left(\frac{1}{p(m)}\right) = -\log p(m)$, where $p(m)$ is the probability that the message $m$ is chosen from all possible choices in the message space.

    It turns out that one of the most useful and important measures of information is the mutual information, or transinformation. This is a measure of how much information can be …

    Wikipedia text under the CC-BY-SA license. (The self-information and mutual information quantities quoted above are illustrated in short code sketches after the results list.)
  4. Units of information - Wikipedia

  5. Quantities of information - Wikiwand

  6. Information theory - Wikipedia

  7. Information theory - Wikiversity

  8. 熵与信息量 (Entropy and Information Content) | hozen.site

  9. Information theory | Definition, History, Examples, & Facts

  10. Information theory - Simple English Wikipedia, the free encyclopedia

  11. What is information?† | Philosophical Transactions of …

    Mar 13, 2016 · Information is a precise concept that can be defined mathematically, but its relationship to what we call ‘knowledge’ is not always made clear. Furthermore, the concepts ‘entropy’ and ‘information’, while …

  12. Information Quantity - SpringerLink

  13. Some results have been removed.
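
The "Quantities of information" excerpt above quotes Shannon's self-information and notes that the choice of logarithmic base fixes the unit (base 2 gives bits, base e gives nats). Below is a minimal Python sketch of those definitions; it assumes nothing beyond the quoted formulas, and the function names are illustrative rather than taken from any of the listed pages.

```python
import math

def self_information(p, base=2):
    """Self-information ("surprisal") of an outcome with probability p:
    I(m) = log(1 / p(m)) = -log p(m). Base 2 gives bits, base e gives nats."""
    return -math.log(p, base)

def entropy(probs, base=2):
    """Shannon entropy of a distribution: the expected self-information."""
    return sum(p * self_information(p, base) for p in probs if p > 0)

# A fair coin toss carries exactly 1 bit per outcome.
print(self_information(0.5))             # 1.0 (bits)
print(entropy([0.5, 0.5]))               # 1.0 (bits)
print(entropy([0.5, 0.5], base=math.e))  # ~0.693 (nats): same quantity, different unit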
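
The same excerpt mentions mutual information (transinformation) but is truncated before giving a formula. The sketch below uses the standard definition I(X;Y) = Σ p(x,y) · log( p(x,y) / (p(x)·p(y)) ); again the function name and the example distributions are illustrative assumptions, not taken from the listed pages.

```python
import math

def mutual_information(joint, base=2):
    """Mutual information I(X;Y) of a joint distribution given as a dict
    {(x, y): probability}, using
    I(X;Y) = sum over (x, y) of p(x,y) * log( p(x,y) / (p(x) * p(y)) )."""
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0.0) + p  # marginal distribution of X
        py[y] = py.get(y, 0.0) + p  # marginal distribution of Y
    return sum(
        p * math.log(p / (px[x] * py[y]), base)
        for (x, y), p in joint.items()
        if p > 0
    )

# Perfectly correlated binary variables share 1 bit of information...
print(mutual_information({(0, 0): 0.5, (1, 1): 0.5}))  # 1.0
# ...while independent variables share none.
print(mutual_information({(0, 0): 0.25, (0, 1): 0.25,
                          (1, 0): 0.25, (1, 1): 0.25}))  # 0.0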