<?xml version="1.0" encoding="utf-8" standalone="yes"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:content="http://purl.org/rss/1.0/modules/content/">
  <channel>
    <title>Tech on xgDebug's Blog</title>
    <link>https://xgdebug.com/zh/posts/tech/</link>
    <description>Recent content in Tech on xgDebug's Blog</description>
    <image>
      <title>xgDebug's Blog</title>
      <url>https://xgdebug.com/images/avatar.png</url>
      <link>https://xgdebug.com/images/avatar.png</link>
    </image>
    <generator>Hugo</generator>
    <language>zh-cn</language>
    <lastBuildDate>Sun, 30 Nov 2025 01:39:49 +0000</lastBuildDate>
    <atom:link href="https://xgdebug.com/zh/posts/tech/index.xml" rel="self" type="application/rss+xml" />
    <item>
      <title>OpenWrt Dual-Router DHCPv6-PD Prefix Delegation Tutorial</title>
      <link>https://xgdebug.com/zh/posts/tech/openwrt/openwrt-dual-router-dhcpv6-pd-prefix-delegation-tutorial/</link>
      <pubDate>Fri, 07 Nov 2025 11:40:50 +0000</pubDate>
      <guid>https://xgdebug.com/zh/posts/tech/openwrt/openwrt-dual-router-dhcpv6-pd-prefix-delegation-tutorial/</guid>
      <description>A walkthrough of obtaining and distributing a public IPv6 prefix via DHCPv6-PD in a cascaded OpenWrt router setup, covering WAN/LAN interfaces, RA/DHCPv6 services, the firewall, and verification and troubleshooting.</description>
      <content:encoded><![CDATA[<h1 id="核心方案配置openwrt_"><strong>Core Approach: Have OpenWRT_B Obtain a Sub-Prefix via DHCPv6-PD</strong></h1>
<p><strong>The key is to assign a suitable prefix length to each interface and to make sure the DHCPv6 and RA services are enabled correctly. The assignment length must be no longer than /63; otherwise there is no room left to delegate a /64 subnet to the next router down.</strong></p>
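The /63 requirement is simple arithmetic: a delegated prefix of length p contains 2^(64 - p) usable /64 subnets. A quick shell sketch (the /60 value matches the prefix length requested later in this tutorial):

```shell
# /64 subnets available from a delegated prefix of length p: 2^(64 - p)
echo $((1 << (64 - 60)))   # /60 delegation -> 16 subnets
echo $((1 << (64 - 63)))   # /63 delegation -> 2 subnets
echo $((1 << (64 - 64)))   # /64 delegation -> 1 subnet, nothing left to hand down
```

So a /64 delegation is enough for one LAN but leaves nothing to delegate further, which is why the upstream assignment must stop at /63 or shorter.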
<h2 id="第一步配置openwrt_"><strong>Step 1: Configure the WAN Port of OpenWRT_B (Connected to OpenWRT_A)</strong></h2>
<p>Open the OpenWRT_B admin interface → <strong>Network</strong> → <strong>Interfaces</strong></p>
<ol>
<li><strong>Edit the WAN interface</strong> (the physical port connected to OpenWRT_A)</li>
<li><strong>Protocol</strong>: select <strong>&ldquo;DHCPv6 client&rdquo;</strong> (if it was previously static or disabled)</li>
<li><strong>Switch to the &quot;Advanced Settings&quot; tab</strong>:
<ul>
<li><strong>&ldquo;Request IPv6 prefix&rdquo;</strong>: check ✓</li>
<li><strong>&ldquo;Requested IPv6 prefix length&rdquo;</strong>: choose <strong>&ldquo;automatic&rdquo;</strong> or enter <strong><code>60</code></strong> manually</li>
</ul>
</li>
<li><strong>Switch to the &quot;Physical Settings&quot; tab</strong>: confirm the interface binding is correct (e.g. eth0.2 or the LAN port)</li>
<li><strong>Save</strong></li>
</ol>
<h2 id="第二步配置openwrt_"><strong>Step 2: Configure the LAN Port of OpenWRT_B (Connected to Your PC)</strong></h2>
<p>Still under <strong>Network</strong> → <strong>Interfaces</strong> → <strong>Edit the LAN interface</strong></p>
<ol>
<li>
<p><strong>Switch to the &quot;General Settings&quot; tab</strong>:</p>
<ul>
<li><strong>IPv6 assignment length</strong>: select <strong><code>63</code></strong> (critical!)</li>
</ul>
</li>
<li>
<p><strong>Switch to the &quot;IPv6 Settings&quot; tab</strong>:</p>
<ul>
<li><strong>&ldquo;IPv6 assignment hint&rdquo;</strong>: enter a subnet ID (e.g. <strong><code>1</code></strong> or <strong><code>2</code></strong>, to avoid clashing with OpenWRT_A's LAN)</li>
<li><strong>&ldquo;IPv6 suffix&rdquo;</strong>: leave empty or set to <strong><code>::1</code></strong></li>
<li><strong>RA service</strong>: select <strong>&ldquo;server mode&rdquo;</strong></li>
<li><strong>DHCPv6 service</strong>: select <strong>&ldquo;server mode&rdquo;</strong></li>
<li><strong>NDP proxy</strong>: select <strong>&ldquo;disabled&rdquo;</strong></li>
<li><strong>RA management</strong>: select <strong>&ldquo;enabled&rdquo;</strong></li>
<li><strong>Always announce default router</strong>: check ✓</li>
</ul>
</li>
<li>
<p><strong>Save and apply</strong></p>
</li>
</ol>
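For reference, steps 1 and 2 can also be expressed with uci from the OpenWRT_B shell. This is a sketch only: the option names follow netifd/odhcpd conventions and may differ between OpenWrt releases, so verify them against your own config before committing.

```shell
# WAN side of OpenWRT_B: DHCPv6 client requesting a delegated prefix (sketch)
uci set network.wan6.proto='dhcpv6'
uci set network.wan6.reqprefix='60'      # or 'auto'

# LAN side: take a /63 from the delegation and serve RA/DHCPv6
uci set network.lan.ip6assign='63'
uci set network.lan.ip6hint='1'          # subnet ID; avoid OpenWRT_A's choice
uci set dhcp.lan.ra='server'
uci set dhcp.lan.dhcpv6='server'
uci set dhcp.lan.ndp='disabled'

uci commit network && uci commit dhcp
/etc/init.d/network restart
```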
<h2 id="第三步检查dhcpv6服务器配置"><strong>Step 3: Check the DHCPv6 Server Configuration</strong></h2>
<p>Go to <strong>Services</strong> → <strong>DHCP/DNS</strong></p>
<ol>
<li><strong>Switch to the &quot;Advanced Settings&quot; tab</strong>:
<ul>
<li>Make sure <strong>&ldquo;Disable IPv6 DNS record resolution&rdquo;</strong> (AAAA filtering) is <strong>unchecked</strong></li>
</ul>
</li>
<li><strong>Switch to the &quot;IPv6 RA Settings&quot; tab</strong>:
<ul>
<li><strong>RA Flags</strong>: check <strong><code>managed</code></strong> and <strong><code>other</code></strong></li>
</ul>
</li>
<li><strong>Save and apply</strong></li>
</ol>
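On recent OpenWrt releases the RA flags can also be set from the shell; `ra_flags` is the odhcpd list option behind those checkboxes. A sketch under that assumption (older releases used different option names, so check `uci show dhcp.lan` first):

```shell
# Announce both flags: clients get addresses via DHCPv6 (managed)
# and other information such as DNS (other)
uci add_list dhcp.lan.ra_flags='managed-config'
uci add_list dhcp.lan.ra_flags='other-config'
uci commit dhcp
/etc/init.d/odhcpd restart
```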
<h2 id="第四步调整防火墙关键"><strong>Step 4: Adjust the Firewall (Critical)</strong></h2>
<p>Go to <strong>Network</strong> → <strong>Firewall</strong></p>
<ol>
<li>
<p><strong>Edit the &quot;wan&quot; zone</strong>:</p>
<ul>
<li><strong>Forward</strong>: select <strong>&ldquo;accept&rdquo;</strong>, or make sure a rule allows forwarding to the lan zone</li>
<li><strong>Covered networks</strong>: confirm it includes <code>wan</code> and <code>wan6</code></li>
</ul>
</li>
<li>
<p><strong>Edit the &quot;lan&quot; zone</strong>:</p>
<ul>
<li><strong>Forward</strong>: select <strong>&ldquo;accept&rdquo;</strong></li>
<li><strong>Covered networks</strong>: confirm it includes <code>lan</code></li>
</ul>
</li>
<li>
<p><strong>Save and apply</strong></p>
</li>
</ol>
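The zone settings above map onto the firewall uci config roughly as follows. A sketch only: the `@zone[0]`/`@zone[1]` indices are an assumption about a default install, so inspect the actual zone order on your router before applying anything.

```shell
# Inspect the current zones first; the indices below are assumptions
uci show firewall | grep "firewall.@zone"

# On a default install @zone[0] is usually lan and @zone[1] is wan
uci set firewall.@zone[0].forward='ACCEPT'   # lan zone: allow forwarding
uci set firewall.@zone[1].forward='ACCEPT'   # wan zone: allow forwarding
uci commit firewall
/etc/init.d/firewall restart
```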
<h1 id="验证与排错"><strong>Verification and Troubleshooting</strong></h1>
<h2 id="在openwrt_"><strong>Run the following commands on OpenWRT_B:</strong></h2>
<div class="highlight"><pre tabindex="0" class="chroma"><code class="language-bash" data-lang="bash"><span class="line"><span class="cl"><span class="c1"># Check whether the PD was obtained successfully</span>
</span></span><span class="line"><span class="cl">ifstatus wan6
</span></span><span class="line"><span class="cl">
</span></span><span class="line"><span class="cl"><span class="c1"># Show the LAN port&#39;s IPv6 addresses</span>
</span></span><span class="line"><span class="cl">ifconfig br-lan <span class="p">|</span> grep inet6
</span></span><span class="line"><span class="cl">
</span></span><span class="line"><span class="cl"><span class="c1"># Show the routing table</span>
</span></span><span class="line"><span class="cl">ip -6 route
</span></span><span class="line"><span class="cl">
</span></span><span class="line"><span class="cl"><span class="c1"># Check the DHCPv6 server status</span>
</span></span><span class="line"><span class="cl">logread <span class="p">|</span> grep dhcp6
</span></span></code></pre></div><p><strong>Success indicators</strong>:</p>
<ul>
<li><code>ifstatus wan6</code> should show the obtained IPv6 PD (e.g. <code>240e:39c:2bae:7001::/60</code>)</li>
<li>The LAN port should have a public IPv6 address (e.g. <code>240e:39c:2bae:7001::1/64</code>)</li>
<li>The routing table should contain a route for the upstream delegated prefix (e.g. <code>240e:39c:2bae:7000::/56</code>) plus a default route pointing toward the upstream router (on OpenWRT_A it points out pppoe-wan)</li>
</ul>
<h2 id="在电脑上验证"><strong>Verify on the PC:</strong></h2>
<div class="highlight"><pre tabindex="0" class="chroma"><code class="language-bash" data-lang="bash"><span class="line"><span class="cl"><span class="c1"># Windows</span>
</span></span><span class="line"><span class="cl">ipconfig /all <span class="p">|</span> findstr IPv6
</span></span><span class="line"><span class="cl">
</span></span><span class="line"><span class="cl"><span class="c1"># Linux/macOS</span>
</span></span><span class="line"><span class="cl">ifconfig <span class="p">|</span> grep inet6
</span></span><span class="line"><span class="cl">
</span></span><span class="line"><span class="cl"><span class="c1"># Test connectivity</span>
</span></span><span class="line"><span class="cl">ping -6 ipv6.google.com
</span></span></code></pre></div><h1 id="常见问题与解决"><strong>Common Problems and Fixes</strong></h1>
<table>
  <thead>
      <tr>
          <th>Problem</th>
          <th>Cause</th>
          <th>Fix</th>
      </tr>
  </thead>
  <tbody>
      <tr>
          <td>OpenWRT_B gets no PD</td>
          <td>OpenWRT_A is not delegating a prefix correctly</td>
          <td>In the advanced settings of OpenWRT_A's WAN6 interface, set the &quot;IPv6 prefix length&quot; to <code>60</code> or <code>56</code></td>
      </tr>
      <tr>
          <td>The PC gets no address</td>
          <td>RA/DHCPv6 services not enabled, or blocked by the firewall</td>
          <td>Check that the LAN port's IPv6 services are in &quot;server mode&quot; and that the firewall allows forwarding</td>
      </tr>
      <tr>
          <td>Address obtained but no internet access</td>
          <td>Missing default route or DNS</td>
          <td>Make sure &quot;Always announce default router&quot; is checked in the RA settings, and check the DNS configuration</td>
      </tr>
      <tr>
          <td>IPv6 address conflict</td>
          <td>Subnet ID clashes with OpenWRT_A</td>
          <td>Change the &quot;IPv6 assignment hint&quot; of OpenWRT_B's LAN to another value (1, 2, 3&hellip;)</td>
      </tr>
  </tbody>
</table>
<h1 id="推荐配置总结"><strong>Recommended Configuration Summary</strong></h1>
<p><strong>OpenWRT_A</strong> (make sure):</p>
<ul>
<li>WAN6 interface → Advanced Settings → IPv6 prefix length: <code>60</code> or <code>56</code></li>
<li>DHCP/DNS → Advanced Settings → check &quot;Dynamic DHCP&quot; and &quot;RA&quot;</li>
</ul>
<p><strong>OpenWRT_B</strong> (critical):</p>
<ul>
<li>WAN interface: DHCPv6 client + request IPv6 prefix</li>
<li>LAN interface: IPv6 assignment length 63 + RA/DHCPv6 in server mode</li>
<li>Firewall: allow IPv6 forwarding from wan to lan</li>
</ul>
<p>Once configured, restart OpenWRT_B's WAN interface (or the whole router) and wait one to two minutes for DHCPv6 to finish prefix negotiation; your PC should then obtain a public IPv6 address.</p>
]]></content:encoded>
    </item>
    <item>
      <title>Configuring IPv6 Relay on OpenWrt</title>
      <link>https://xgdebug.com/zh/posts/tech/openwrt/openwrt-ipv6-relay-configuration/</link>
      <pubDate>Sun, 26 Oct 2025 14:03:45 +0000</pubDate>
      <guid>https://xgdebug.com/zh/posts/tech/openwrt/openwrt-ipv6-relay-configuration/</guid>
      <description>To keep up with the times, this post shows how to configure IPv6 relay mode on a second-tier OpenWrt router so that devices on the inner network can also obtain IPv6 addresses.</description>
      <content:encoded><![CDATA[<p>To keep up with the times and get a taste of IPv6, I decided to bring IPv6 to my second-tier internal network.</p>
<p>My home network has two tiers. The first is the main router: it obtains an IPv6 address and an IPv6 PD from the ISP and can hand out public IPv6 addresses to the devices connected to it. The second tier is the router in my study: it can obtain a public IPv6 address for itself, but only gives connected devices internal IPv6 addresses.</p>
<p>On the second-tier router, open OpenWRT Settings → Interfaces → WAN6 → DHCP Server → IPv6 Settings</p>
<p>Set the Router Advertisement service, DHCPv6 service, and NDP proxy all to relay mode, and check the designated-master option:</p>
<pre tabindex="0"><code>Designated master 打勾

Set this interface as master for RA and DHCPv6 relaying as well as NDP proxying.
RA-Service relay mode

Configures the operation mode of the RA service on this interface.
DHCPv6-Service relay mode

Configures the operation mode of the DHCPv6 service on this interface.
NDP-Proxy relay mode
</code></pre><p>Then open OpenWRT Settings → Interfaces → LAN → DHCP Server → IPv6 Settings</p>
<p>Set the Router Advertisement service, DHCPv6 service, and NDP proxy to relay mode here as well; note that Designated master is NOT checked this time</p>
<p>Finally, save and apply everything, then test IPv6 again: addresses are now obtained normally</p>
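The same relay setup can be applied from the shell. A sketch of the uci commands that produce the state captured in the dump at the end of this post (interface names as used here):

```shell
# WAN6: the master interface for RA/DHCPv6 relaying and NDP proxying
uci set dhcp.wan6.master='1'
uci set dhcp.wan6.ra='relay'
uci set dhcp.wan6.dhcpv6='relay'
uci set dhcp.wan6.ndp='relay'

# LAN: relay mode too, but NOT the designated master
uci set dhcp.lan.ra='relay'
uci set dhcp.lan.dhcpv6='relay'
uci set dhcp.lan.ndp='relay'

uci commit dhcp
/etc/init.d/odhcpd restart
```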
<p>The final uci configuration:</p>
<div class="highlight"><pre tabindex="0" class="chroma"><code class="language-bash" data-lang="bash"><span class="line"><span class="cl">root@OpenWRT-MY:~# uci show dhcp.lan
</span></span><span class="line"><span class="cl">dhcp.lan<span class="o">=</span>dhcp
</span></span><span class="line"><span class="cl">dhcp.lan.interface<span class="o">=</span><span class="s1">&#39;lan&#39;</span>
</span></span><span class="line"><span class="cl">dhcp.lan.start<span class="o">=</span><span class="s1">&#39;100&#39;</span>
</span></span><span class="line"><span class="cl">dhcp.lan.limit<span class="o">=</span><span class="s1">&#39;150&#39;</span>
</span></span><span class="line"><span class="cl">dhcp.lan.leasetime<span class="o">=</span><span class="s1">&#39;12h&#39;</span>
</span></span><span class="line"><span class="cl">dhcp.lan.dhcpv4<span class="o">=</span><span class="s1">&#39;server&#39;</span>
</span></span><span class="line"><span class="cl">dhcp.lan.ra<span class="o">=</span><span class="s1">&#39;relay&#39;</span>
</span></span><span class="line"><span class="cl">dhcp.lan.dhcpv6<span class="o">=</span><span class="s1">&#39;relay&#39;</span>
</span></span><span class="line"><span class="cl">dhcp.lan.ndp<span class="o">=</span><span class="s1">&#39;relay&#39;</span>
</span></span><span class="line"><span class="cl">root@OpenWRT-MY:~# uci show dhcp.wan6
</span></span><span class="line"><span class="cl">dhcp.wan6<span class="o">=</span>dhcp
</span></span><span class="line"><span class="cl">dhcp.wan6.interface<span class="o">=</span><span class="s1">&#39;wan6&#39;</span>
</span></span><span class="line"><span class="cl">dhcp.wan6.ignore<span class="o">=</span><span class="s1">&#39;1&#39;</span>
</span></span><span class="line"><span class="cl">dhcp.wan6.master<span class="o">=</span><span class="s1">&#39;1&#39;</span>
</span></span><span class="line"><span class="cl">dhcp.wan6.ra<span class="o">=</span><span class="s1">&#39;relay&#39;</span>
</span></span><span class="line"><span class="cl">dhcp.wan6.dhcpv6<span class="o">=</span><span class="s1">&#39;relay&#39;</span>
</span></span><span class="line"><span class="cl">dhcp.wan6.ndp<span class="o">=</span><span class="s1">&#39;relay&#39;</span>
</span></span></code></pre></div>]]></content:encoded>
    </item>
    <item>
      <title>Developer Opportunities in Human Weaknesses</title>
      <link>https://xgdebug.com/zh/posts/tech/app/developer-opportunities-in-human-weaknesses/</link>
      <pubDate>Sat, 12 Jul 2025 08:41:37 +0000</pubDate>
      <guid>https://xgdebug.com/zh/posts/tech/app/developer-opportunities-in-human-weaknesses/</guid>
      <description>This post lays out core principles for solo developers and walks through 10 human-nature drivers of consumer behavior, along with the business logic behind each.</description>
      <content:encoded><![CDATA[<h2 id="个人开发者的黄金法则">Golden Rules for Solo Developers:</h2>
<ul>
<li>Niche down: don't try to build a platform; build a &ldquo;plugin&rdquo;, a &ldquo;tool&rdquo;, or a &ldquo;solution&rdquo;.</li>
<li>Automation and SaaS first: aim for subscription-based software that runs itself (SaaS), earning income while you sleep.</li>
<li>Stay asset-light: avoid models that require heavy operations, customer support, or physical inventory.</li>
<li>Leverage AI: in 2025, AI is the most powerful lever a solo developer has, letting one person do the work that used to take a team.</li>
</ul>
<h2 id="10条人性密码">The 10 Human-Nature Codes</h2>
<p><strong>1. Money from male desire</strong></p>
<ul>
<li><strong>Human-nature code:</strong> the reproductive instinct, the pursuit of attractiveness, the display of social status.</li>
<li><strong>Business logic:</strong> this is not limited to the adult industry; it shows up broadly in dating apps, high-end cars, designer watches, fitness, and fashion. Every product that enhances male appeal or symbolizes status taps into this primal drive.</li>
</ul>
<p><strong>2. Money from women's love of beauty</strong></p>
<ul>
<li><strong>Human-nature code:</strong> the pursuit of social approval, raising one's self-worth, fighting the anxiety of aging.</li>
<li><strong>Business logic:</strong> from skincare and makeup to cosmetic surgery, fashion, and jewelry, this market never runs dry. It binds deeply to women's psychological fulfillment and, by promising to make them &ldquo;more beautiful&rdquo;, creates unlimited consumer demand.</li>
</ul>
<p><strong>3. Money from the poor chasing sudden wealth</strong></p>
<ul>
<li><strong>Human-nature code:</strong> an extreme longing to change one's fate, fantasies of shortcuts, wishful thinking.</li>
<li><strong>Business logic:</strong> lotteries, gambling, high-risk financial derivatives, certain pyramid schemes&hellip; what they sell is not a product but the dream of overnight riches. That dream is seductive enough to move even the most rational people.</li>
</ul>
<p><strong>4. Money from the rich fearing death</strong></p>
<ul>
<li><strong>Human-nature code:</strong> the ultimate fear for life and health, plus the urge to protect existing wealth and status.</li>
<li><strong>Business logic:</strong> once wealth reaches a certain scale, life itself becomes the most precious asset. This breeds astronomically priced medical care, life-extension technology, premium health products, and all-round security services. It is an &ldquo;unlimited budget&rdquo; market whose customers are insensitive to price.</li>
</ul>
<p><strong>5. Money from children's education</strong></p>
<ul>
<li><strong>Human-nature code:</strong> parental love, anxiety about class mobility, uncertainty about the future.</li>
<li><strong>Business logic:</strong> &ldquo;Don't let your child lose at the starting line&rdquo; has levered open a trillion-scale market. From school-district housing and private schools to tutoring and hobby classes, parents will pay anything for their children's future. It is consumption laced with love and anxiety, and almost impossible to refuse.</li>
</ul>
<p><strong>6. Money from elderly health</strong></p>
<ul>
<li><strong>Human-nature code:</strong> the desire for quality of life, fear of aging and illness, the wish to live longer.</li>
<li><strong>Business logic:</strong> as the world ages, this market's potential keeps growing. Health supplements, medical devices, specialty drugs, elder-care services&hellip; anything that relieves pain or improves quality of life in old age enjoys enormous, inelastic demand.</li>
</ul>
<p><strong>7. Money from lazy people wanting convenience (The Currency of Convenience)</strong></p>
<ul>
<li><strong>Human-nature code:</strong> seeking comfort, avoiding complexity, trading money for time.</li>
<li><strong>Business logic:</strong> food delivery, ride-hailing, pre-made meals, on-demand home services&hellip; much of modern commerce's success is built on making users lazier. By offering extreme convenience, merchants can charge a hefty premium and turn that convenience into dependence.</li>
</ul>
<p><strong>8. Money from respectable people wanting recognition (The Currency of Status and Virtue)</strong></p>
<ul>
<li><strong>Human-nature code:</strong> the craving to be respected and recognized, using consumption to display identity, taste, and moral stance.</li>
<li><strong>Business logic:</strong> this works on two levels. One is showing &ldquo;I am rich&rdquo; through luxury goods, mansions, and fine cars; the other is showing &ldquo;I am good&rdquo; by buying eco-friendly products and supporting ethical brands. Whether material display or moral superiority, people will pay a steep &ldquo;status tax&rdquo; for it.</li>
</ul>
<p><strong>9. Money from bored people wanting entertainment (The Currency of Escapism)</strong></p>
<ul>
<li><strong>Human-nature code:</strong> resisting the mundane, seeking stimulation, filling an inner void.</li>
<li><strong>Business logic:</strong> games, short video, live-stream tipping, the idol economy&hellip; these industries excel at engineering addictive &ldquo;flow&rdquo; experiences and instant-feedback loops. They build virtual worlds to escape into, and users pour in time and money willingly.</li>
</ul>
<p><strong>10. Money from ordinary people wanting peace of mind (The Currency of Security)</strong></p>
<ul>
<li><strong>Human-nature code:</strong> the search for stability and a sense of control in an uncertain world.</li>
<li><strong>Business logic:</strong> from insurance and investment products to network security, data storage, and home security, every business that sells &ldquo;a sense of safety&rdquo; thrives. By moderately amplifying potential risks and then offering a &ldquo;solution&rdquo;, merchants precisely satisfy the human craving for certainty.</li>
</ul>
]]></content:encoded>
    </item>
    <item>
      <title>HyperDbg</title>
      <link>https://xgdebug.com/zh/posts/tech/debug/hyperdbg/</link>
      <pubDate>Thu, 30 Jan 2025 10:31:52 +0800</pubDate>
      <guid>https://xgdebug.com/zh/posts/tech/debug/hyperdbg/</guid>
      <description>&lt;h2 id=&#34;设置环境&#34;&gt;Setting Up the Environment&lt;/h2&gt;
&lt;pre&gt;&lt;code&gt;.debug remote namedpipe \\.\pipe\HyperDbgPipe
.debug prepare serial 115200 com1
.sympath SRV*c:\Symbols*https://msdl.microsoft.com/download/symbols
.sym download
.sym reload
.process list
.thread list process 0000000015e2b000
&lt;/code&gt;&lt;/pre&gt;
&lt;h2 id=&#34;切换进程&#34;&gt;Switching Processes&lt;/h2&gt;
&lt;pre&gt;&lt;code&gt;.process list
.process pid 1394
g
&lt;/code&gt;&lt;/pre&gt;
&lt;p&gt;When the debugger is paused, use &lt;code&gt;.process&lt;/code&gt; to inspect processes&lt;/p&gt;</description>
      <content:encoded><![CDATA[<h2 id="设置环境">Setting Up the Environment</h2>
<pre tabindex="0"><code>.debug remote namedpipe \\.\pipe\HyperDbgPipe
.debug prepare serial 115200 com1
.sympath SRV*c:\Symbols*https://msdl.microsoft.com/download/symbols
.sym download
.sym reload
.process list
.thread list process 0000000015e2b000
</code></pre>
<h2 id="切换进程">Switching Processes</h2>
<pre tabindex="0"><code>.process list
.process pid 1394
g
</code></pre>
<p>When the debugger is paused, use <code>.process</code> to inspect processes</p>
]]></content:encoded>
    </item>
    <item>
      <title>How Do I Promote an App?</title>
      <link>https://xgdebug.com/zh/posts/tech/app/how-to-promote-app/</link>
      <pubDate>Tue, 19 Nov 2024 14:58:38 +0800</pubDate>
      <guid>https://xgdebug.com/zh/posts/tech/app/how-to-promote-app/</guid>
      <description>&lt;p&gt;This is a tricky problem; here is everything I have come up with so far.&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Reply under YouTube tutorials&lt;/li&gt;
&lt;li&gt;Reply on blog tutorials&lt;/li&gt;
&lt;li&gt;Search all related content and reply there&lt;/li&gt;
&lt;li&gt;Post on X&lt;/li&gt;
&lt;/ul&gt;</description>
      <content:encoded><![CDATA[<p>This is a tricky problem; here is everything I have come up with so far.</p>
<ul>
<li>Reply under YouTube tutorials</li>
<li>Reply on blog tutorials</li>
<li>Search all related content and reply there</li>
<li>Post on X</li>
</ul>
]]></content:encoded>
    </item>
    <item>
      <title>Hugo Mobile Project Analysis</title>
      <link>https://xgdebug.com/zh/posts/tech/app/hugo-mobile-project-analysis/</link>
      <pubDate>Sat, 09 Nov 2024 13:05:38 +0800</pubDate>
      <guid>https://xgdebug.com/zh/posts/tech/app/hugo-mobile-project-analysis/</guid>
      <description>&lt;h2 id=&#34;痛点&#34;&gt;Pain Point&lt;/h2&gt;
&lt;p&gt;Hugo is an excellent static site generator; many people use it to write blogs and deploy them on GitHub Pages. But that workflow only works on a computer: every time you want to write a post, you have to open your laptop. I badly want a way to capture my thinking at any moment, so I can pull out my phone and record an idea the instant it strikes. That calls for an app that runs on the phone.&lt;/p&gt;</description>
      <content:encoded><![CDATA[<h2 id="痛点">Pain Point</h2>
<p>Hugo is an excellent static site generator; many people use it to write blogs and deploy them on GitHub Pages. But that workflow only works on a computer: every time you want to write a post, you have to open your laptop. I badly want a way to capture my thinking at any moment, so I can pull out my phone and record an idea the instant it strikes. That calls for an app that runs on the phone.</p>
<h2 id="市场调查">Market Research</h2>
<p>Before building anything, evaluate the project and gauge whether users are willing to pay.
Fake it till you make it.</p>
<h2 id="实现方法">Implementation Options</h2>
<ol>
<li>Bundle git and openssl in the mobile app and operate entirely on-device</li>
<li>Have a server git clone the repository, parse it with hugo, and expose an API covering directories, categories, tags, and so on</li>
</ol>
<h2 id="又犯了相同的错误">Making the Same Mistake Again</h2>
<p>I did no market research and jumped straight into building a prototype. Progress was fast, but I know this is really a form of laziness: development is the easiest part for me. One thing is different this time, though; I have a genuine need of my own. So at least let me use it first, because a product even I would not use will never find other users.</p>
<h3 id="2024-10-26">2024-10-26</h3>
<ul>
<li>Editing and creating posts are done</li>
<li>Improved slow article loading, mainly by drawing the widgets first and filling in the content afterwards</li>
<li>Still missing: commit and push to the server</li>
</ul>
<h3 id="2024-10-27">2024-10-27</h3>
<ul>
<li>Tried an Android build; only the Dart part succeeded, of course</li>
<li>Got committing to the server working</li>
<li>Finally got the plugin interaction working on Linux</li>
</ul>
<h3 id="2024-11-08">2024-11-08</h3>
<ul>
<li>Feels ready for a first release; it is passable</li>
<li>Later I could add repository analysis and header-image loading</li>
<li>Next I need to think about how to promote it</li>
</ul>
<h3 id="2024-11-09">2024-11-09</h3>
<ul>
<li>
<p>Planned to show only .md files, but it does not seem to work</p>
</li>
<li>
<p>Consider displaying images</p>
</li>
<li>
<p>Both of the issues above were solved this morning.</p>
</li>
<li>
<p>New bug: after analysis, the app must be restarted before it takes effect</p>
</li>
<li>
<p>The add-tag control should sit below the existing tags</p>
</li>
<li>
<p>Finished the first version</p>
</li>
<li>
<p>Still missing a landing page</p>
</li>
<li>
<p>Next I need to buy a domain and set up a blog for the product itself</p>
</li>
</ul>
<h3 id="2024-11-11">2024-11-11</h3>
<p>Hit a big gotcha today: the keyboard covers the toolbar below the input field. The fix is to wrap the outermost layer in a SingleChildScrollView; the input field is then pushed up automatically when the keyboard appears.</p>
<ul>
<li>The app icon got lost</li>
</ul>
<h3 id="2024-11-12">2024-11-12</h3>
<ul>
<li>Worked through lots and lots of details tonight; the devil is in the details</li>
<li>Tomorrow I could enlarge the preview font; it is too small right now.</li>
<li>The sync status is broken too; need to check it tomorrow. Mainly, the progress bar does not display as expected during sync</li>
</ul>
<h3 id="2024-11-12-1">2024-11-12</h3>
<ul>
<li>Got multi-language support working today</li>
<li>Consider making the URL theme a bit smaller</li>
<li>Update the loading state of the settings page while syncing</li>
</ul>
<h3 id="2024-11-13">2024-11-13</h3>
<p>If a sync is in progress, tapping the repository should jump straight to the sync page in settings instead of entering the directory browser.</p>
<h3 id="2024-11-22">2024-11-22</h3>
<ul>
<li>I think a recently-edited list is needed, so I can open recently modified files as fast as possible.</li>
<li>Full-text search is also needed, though I am not sure which tech stack suits it best</li>
</ul>
]]></content:encoded>
    </item>
    <item>
      <title>Auto-Deploying a Hugo Blog to GitHub with a Deploy Key</title>
      <link>https://xgdebug.com/zh/posts/tech/linux/deploy-hugo-blog-with-deploy-key/</link>
      <pubDate>Sun, 29 Sep 2024 10:08:49 +0800</pubDate>
      <guid>https://xgdebug.com/zh/posts/tech/linux/deploy-hugo-blog-with-deploy-key/</guid>
      <description>&lt;h2 id=&#34;1生成-ssh-密钥&#34;&gt;1: Generate an SSH Key&lt;/h2&gt;
&lt;p&gt;Generate a new SSH key pair in a local terminal:&lt;/p&gt;
&lt;div class=&#34;highlight&#34;&gt;&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code class=&#34;language-bash&#34; data-lang=&#34;bash&#34;&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;ssh-keygen -t rsa -b &lt;span class=&#34;m&#34;&gt;4096&lt;/span&gt; -C &lt;span class=&#34;s2&#34;&gt;&amp;#34;your_email@example.com&amp;#34;&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;p&gt;It will ask you to name the file; press Enter to use the default path (~/.ssh/id_rsa). You will then have two files:&lt;/p&gt;</description>
      <content:encoded><![CDATA[<h2 id="1生成-ssh-密钥">1: Generate an SSH Key</h2>
<p>Generate a new SSH key pair in a local terminal:</p>
<div class="highlight"><pre tabindex="0" class="chroma"><code class="language-bash" data-lang="bash"><span class="line"><span class="cl">ssh-keygen -t rsa -b <span class="m">4096</span> -C <span class="s2">&#34;your_email@example.com&#34;</span>
</span></span></code></pre></div><p>It will ask you to name the file; press Enter to use the default path (~/.ssh/id_rsa). You will then have two files:</p>
<p>id_rsa (the private key)
id_rsa.pub (the public key)</p>
<h2 id="2在-github-pages-仓库中添加-deploy-key">2: Add a Deploy Key to the GitHub Pages Repository</h2>
<p>Open your GitHub Pages repository xgDebug/xgdebug.github.io.
Go to Settings -&gt; Deploy keys.
Click Add deploy key.
Title: give it a descriptive name (e.g. &ldquo;Hugo Blog Deployment Key&rdquo;).
Key: paste in the contents of the public key id_rsa.pub generated in the previous step.
Check Allow write access, since write permission is needed.
Click Add key.</p>
<h2 id="3将私钥添加到-github-secrets">3: Add the Private Key to GitHub Secrets</h2>
<p>In your Hugo repository xgDebug/xgDebug_blog, go to Settings -&gt; Secrets -&gt; Actions.
Click New repository secret and create a new secret:
Name: DEPLOY_KEY
Value: paste in the contents of the generated private key id_rsa.</p>
<h2 id="4更新-github-actions-配置">4: Update the GitHub Actions Configuration</h2>
<p>Use DEPLOY_KEY for authentication in .github/workflows/gh-pages.yml:</p>
<div class="highlight"><pre tabindex="0" class="chroma"><code class="language-yaml" data-lang="yaml">name: Deploy Hugo Site to GitHub Pages

on:
  push:
    branches:
      - master

jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout repository
        uses: actions/checkout@v3

      - name: Set up Hugo
        uses: peaceiris/actions-hugo@v2
        with:
          hugo-version: 'latest'

      - name: Install Hugo themes
        run: git submodule update --init --recursive

      - name: Build Hugo site
        run: hugo --minify

      - name: Deploy to GitHub Pages
        uses: peaceiris/actions-gh-pages@v3
        with:
          deploy_key: ${{ secrets.DEPLOY_KEY }}
          publish_dir: ./public
          external_repository: xgDebug/xgdebug.github.io
          publish_branch: gh-pages
</code></pre></div>]]></content:encoded>
    </item>
    <item>
      <title>Deploying a PyTorch Model to Android with ncnn</title>
      <link>https://xgdebug.com/zh/posts/tech/ai/deploy-pytorch-model-to-android-with-ncnn/</link>
      <pubDate>Sat, 28 Sep 2024 05:08:49 +0000</pubDate>
      <guid>https://xgdebug.com/zh/posts/tech/ai/deploy-pytorch-model-to-android-with-ncnn/</guid>
      <description>This post covers deploying a PyTorch model to an Android phone with NCNN, including building NCNN, training YOLO, converting the model, and building NCNN on many platforms: Linux, Windows, macOS, ARM, HiSilicon, Android, iOS, and WebAssembly.</description>
      <content:encoded><![CDATA[<h2 id="使用-ncnn-布署-pytorch-模型到-android-手机">Deploying a PyTorch Model to an Android Phone with ncnn</h2>
<ol>
<li>When building NCNN, enable GPU support: Vulkan targets the GPU, so build with -DNCNN_VULKAN=ON</li>
<li>MobileNetV3</li>
</ol>
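The Vulkan flag from point 1 fits into an out-of-tree CMake build like this. A sketch only: the repository URL and the surrounding steps follow ncnn's usual CMake workflow and are not taken from this post; only -DNCNN_VULKAN=ON comes from it.

```shell
git clone https://github.com/Tencent/ncnn.git
cd ncnn
git submodule update --init        # ncnn vendors glslang etc. as submodules
mkdir -p build && cd build
cmake -DNCNN_VULKAN=ON ..          # enable the GPU (Vulkan) backend
make -j"$(nproc)"
```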
<h2 id="編譯成-mt-時要打開-cmake-0091-特性">Enable CMake Policy CMP0091 When Building with /MT</h2>
<div class="highlight"><pre tabindex="0" class="chroma"><code class="language-cmake" data-lang="cmake"><span class="line"><span class="cl"><span class="nb">cmake_minimum_required</span><span class="p">(</span><span class="s">VERSION</span> <span class="s">3.20.0</span><span class="p">)</span><span class="err">
</span></span></span><span class="line"><span class="cl"><span class="nb">cmake_policy</span><span class="p">(</span><span class="s">SET</span> <span class="s">CMP0091</span> <span class="s">NEW</span><span class="p">)</span><span class="err">
</span></span></span><span class="line"><span class="cl"><span class="nb">set</span><span class="p">(</span><span class="s">CMAKE_MSVC_RUNTIME_LIBRARY</span> <span class="s2">&#34;MultiThreaded$&lt;$&lt;CONFIG:Debug&gt;:Debug&gt;&#34;</span><span class="p">)</span><span class="err">
</span></span></span><span class="line"><span class="cl"><span class="nb">project</span><span class="p">(</span><span class="s2">&#34;client-project&#34;</span><span class="p">)</span><span class="err">
</span></span></span></code></pre></div><h3 id="训练-yolo">训练 YOLO</h3>
<div class="highlight"><pre tabindex="0" class="chroma"><code class="language-shell" data-lang="shell"><span class="line"><span class="cl"><span class="se">\E</span>nvs<span class="se">\t</span>orch<span class="se">\S</span>cripts<span class="se">\a</span>ctivate.ps1
</span></span><span class="line"><span class="cl">python train.py --batch <span class="m">6</span> --workers <span class="m">2</span> --imgsz <span class="m">960</span> --epochs <span class="m">300</span> --data <span class="s2">&#34;\Core\yaml\data.yaml&#34;</span> --cfg <span class="s2">&#34;\Core\yaml\cfg.yaml&#34;</span> --weights <span class="se">\C</span>ore<span class="se">\w</span>eights<span class="se">\b</span>est.pt --device <span class="m">0</span>
</span></span></code></pre></div><h4 id="转换模型">Converting the Model</h4>
<div class="highlight"><pre tabindex="0" class="chroma"><code class="language-python" data-lang="python"><span class="line"><span class="cl"><span class="kn">from</span> <span class="nn">torch</span> <span class="kn">import</span> <span class="n">nn</span>
</span></span><span class="line"><span class="cl"><span class="kn">import</span> <span class="nn">torch.utils.model_zoo</span> <span class="k">as</span> <span class="nn">model_zoo</span>
</span></span><span class="line"><span class="cl"><span class="kn">import</span> <span class="nn">torch.onnx</span>
</span></span><span class="line"><span class="cl"><span class="kn">from</span> <span class="nn">libs</span> <span class="kn">import</span> <span class="n">define</span>
</span></span><span class="line"><span class="cl"><span class="kn">from</span> <span class="nn">libs.net</span> <span class="kn">import</span> <span class="n">Net</span>
</span></span><span class="line"><span class="cl"><span class="kn">from</span> <span class="nn">libs.dataset</span> <span class="kn">import</span> <span class="n">ImageDataset</span>
</span></span><span class="line"><span class="cl"><span class="kn">import</span> <span class="nn">os</span>
</span></span><span class="line"><span class="cl">
</span></span><span class="line"><span class="cl"><span class="n">test_data</span> <span class="o">=</span> <span class="n">ImageDataset</span><span class="p">(</span><span class="n">define</span><span class="o">.</span><span class="n">testPath</span><span class="p">,</span><span class="kc">False</span><span class="p">)</span>
</span></span><span class="line"><span class="cl"><span class="n">test_loader</span> <span class="o">=</span> <span class="n">torch</span><span class="o">.</span><span class="n">utils</span><span class="o">.</span><span class="n">data</span><span class="o">.</span><span class="n">DataLoader</span><span class="p">(</span> <span class="n">test_data</span><span class="p">,</span> <span class="n">batch_size</span><span class="o">=</span><span class="mi">1</span><span class="p">,</span> <span class="n">shuffle</span><span class="o">=</span><span class="kc">True</span><span class="p">)</span>
</span></span><span class="line"><span class="cl">
</span></span><span class="line"><span class="cl"><span class="n">device</span> <span class="o">=</span> <span class="n">torch</span><span class="o">.</span><span class="n">device</span><span class="p">(</span><span class="s2">&#34;cuda&#34;</span> <span class="k">if</span> <span class="n">torch</span><span class="o">.</span><span class="n">cuda</span><span class="o">.</span><span class="n">is_available</span><span class="p">()</span> <span class="k">else</span> <span class="s2">&#34;cpu&#34;</span><span class="p">)</span>
</span></span><span class="line"><span class="cl"><span class="n">model</span> <span class="o">=</span> <span class="n">Net</span><span class="p">(</span><span class="n">out_dim</span><span class="o">=</span><span class="mi">19</span><span class="p">)</span><span class="o">.</span><span class="n">to</span><span class="p">(</span><span class="n">device</span><span class="p">)</span>
</span></span><span class="line"><span class="cl"><span class="n">model</span><span class="o">.</span><span class="n">load_state_dict</span><span class="p">(</span><span class="n">torch</span><span class="o">.</span><span class="n">load</span><span class="p">(</span> <span class="s2">&#34;./widget/last.pt&#34;</span> <span class="p">))</span>
</span></span><span class="line"><span class="cl"><span class="n">model</span><span class="o">.</span><span class="n">eval</span><span class="p">()</span>
</span></span><span class="line"><span class="cl">
</span></span><span class="line"><span class="cl"><span class="k">def</span> <span class="nf">saveOnnx</span><span class="p">():</span>
</span></span><span class="line"><span class="cl">    <span class="k">for</span> <span class="n">data</span><span class="p">,</span> <span class="n">target</span> <span class="ow">in</span> <span class="n">test_loader</span><span class="p">:</span>
</span></span><span class="line"><span class="cl">        <span class="n">data</span><span class="p">,</span> <span class="n">target</span> <span class="o">=</span> <span class="n">data</span><span class="o">.</span><span class="n">to</span><span class="p">(</span><span class="n">device</span><span class="p">),</span> <span class="n">target</span><span class="o">.</span><span class="n">to</span><span class="p">(</span><span class="n">device</span><span class="p">)</span>
</span></span><span class="line"><span class="cl">        <span class="n">label</span> <span class="o">=</span> <span class="n">target</span><span class="o">.</span><span class="n">long</span><span class="p">()</span>
</span></span><span class="line"><span class="cl">        <span class="n">y</span> <span class="o">=</span> <span class="n">model</span><span class="p">(</span><span class="n">data</span><span class="p">)</span>
</span></span><span class="line"><span class="cl">        <span class="c1"># Export the model</span>
</span></span><span class="line"><span class="cl">        <span class="n">torch</span><span class="o">.</span><span class="n">onnx</span><span class="o">.</span><span class="n">export</span><span class="p">(</span><span class="n">model</span><span class="p">,</span>                   <span class="c1"># model being run</span>
</span></span><span class="line"><span class="cl">                        <span class="n">data</span><span class="p">,</span>                      <span class="c1"># model input (or a tuple for multiple inputs)</span>
</span></span><span class="line"><span class="cl">                        <span class="s2">&#34;./widget/best.onnx&#34;</span><span class="p">,</span>            <span class="c1"># where to save the model (can be a file or file-like object)</span>
</span></span><span class="line"><span class="cl">                        <span class="n">export_params</span><span class="o">=</span><span class="kc">True</span><span class="p">,</span>        <span class="c1"># store the trained parameter weights inside the model file</span>
</span></span><span class="line"><span class="cl">                        <span class="n">opset_version</span><span class="o">=</span><span class="mi">10</span><span class="p">,</span>          <span class="c1"># the ONNX version to export the model to</span>
</span></span><span class="line"><span class="cl">                        <span class="n">do_constant_folding</span><span class="o">=</span><span class="kc">True</span><span class="p">,</span>  <span class="c1"># whether to execute constant folding for optimization</span>
</span></span><span class="line"><span class="cl">                        <span class="n">input_names</span> <span class="o">=</span> <span class="p">[</span><span class="s1">&#39;input&#39;</span><span class="p">],</span>   <span class="c1"># the model&#39;s input names</span>
</span></span><span class="line"><span class="cl">                        <span class="n">output_names</span> <span class="o">=</span> <span class="p">[</span><span class="s1">&#39;output&#39;</span><span class="p">],</span>  <span class="c1"># the model&#39;s output names</span>
</span></span><span class="line"><span class="cl">                        <span class="n">dynamic_axes</span><span class="o">=</span><span class="p">{</span><span class="s1">&#39;input&#39;</span> <span class="p">:</span> <span class="p">{</span><span class="mi">0</span> <span class="p">:</span> <span class="s1">&#39;batch_size&#39;</span><span class="p">},</span>    <span class="c1"># variable length axes</span>
</span></span><span class="line"><span class="cl">                                        <span class="s1">&#39;output&#39;</span> <span class="p">:</span> <span class="p">{</span><span class="mi">0</span> <span class="p">:</span> <span class="s1">&#39;batch_size&#39;</span><span class="p">}})</span>
</span></span><span class="line"><span class="cl">
</span></span><span class="line"><span class="cl">        <span class="n">traced_script_module</span> <span class="o">=</span> <span class="n">torch</span><span class="o">.</span><span class="n">jit</span><span class="o">.</span><span class="n">trace</span><span class="p">(</span><span class="n">model</span><span class="p">,</span> <span class="n">data</span><span class="p">)</span>
</span></span><span class="line"><span class="cl">        <span class="k">return</span>
</span></span><span class="line"><span class="cl">
</span></span><span class="line"><span class="cl"><span class="n">saveOnnx</span><span class="p">()</span>
</span></span><span class="line"><span class="cl"><span class="c1"># convert to ncnn format</span>
</span></span><span class="line"><span class="cl"><span class="n">os</span><span class="o">.</span><span class="n">system</span><span class="p">(</span><span class="s2">&#34;python -m onnxsim ./widget/best.onnx ./widget/best-sim.onnx&#34;</span><span class="p">)</span>
</span></span><span class="line"><span class="cl"><span class="n">os</span><span class="o">.</span><span class="n">system</span><span class="p">(</span><span class="s2">&#34;./bin/onnx2ncnn.exe ./widget/best-sim.onnx ./widget/best.param ./widget/best.bin&#34;</span><span class="p">)</span>
</span></span><span class="line"><span class="cl"><span class="n">os</span><span class="o">.</span><span class="n">system</span><span class="p">(</span><span class="s2">&#34;./bin/ncnnoptimize.exe ./widget/best.param ./widget/best.bin ./widget/best-opt.param ./widget/best-opt.bin 65536&#34;</span><span class="p">)</span>
</span></span></code></pre></div><div class="highlight"><pre tabindex="0" class="chroma"><code class="language-shell" data-lang="shell"><span class="line"><span class="cl">python .<span class="se">\e</span>xport.py --weights weights/best.pt --img <span class="m">960</span> --batch <span class="m">1</span> --train
</span></span><span class="line"><span class="cl">python -m onnxsim best.onnx best-sim.onnx
</span></span><span class="line"><span class="cl">.<span class="se">\o</span>nnx2ncnn.exe best-sim.onnx best.param best.bin
</span></span><span class="line"><span class="cl">ncnnoptimize best.param best.bin best-opt.param best-opt.bin <span class="m">65536</span>
</span></span></code></pre></div><h3 id="git-clone-ncnn-repo-with-submodule">Git clone ncnn repo with submodule</h3>
<pre tabindex="0"><code>$ git clone https://github.com/Tencent/ncnn.git
$ cd ncnn
$ git submodule update --init
</code></pre><ul>
<li><a href="/zh/posts/tech/ai/deploy-pytorch-model-to-android-with-ncnn/#build-for-linux">Build for Linux / NVIDIA Jetson / Raspberry Pi</a></li>
<li><a href="/zh/posts/tech/ai/deploy-pytorch-model-to-android-with-ncnn/#build-for-windows-x64-using-visual-studio-community-2017">Build for Windows x64 using VS2017</a></li>
<li><a href="/zh/posts/tech/ai/deploy-pytorch-model-to-android-with-ncnn/#build-for-macos">Build for macOS</a></li>
<li><a href="/zh/posts/tech/ai/deploy-pytorch-model-to-android-with-ncnn/#build-for-arm-cortex-a-family-with-cross-compiling">Build for ARM Cortex-A family with cross-compiling</a></li>
<li><a href="/zh/posts/tech/ai/deploy-pytorch-model-to-android-with-ncnn/#build-for-hisilicon-platform-with-cross-compiling">Build for Hisilicon platform with cross-compiling</a></li>
<li><a href="/zh/posts/tech/ai/deploy-pytorch-model-to-android-with-ncnn/#build-for-android">Build for Android</a></li>
<li><a href="/zh/posts/tech/ai/deploy-pytorch-model-to-android-with-ncnn/#build-for-ios-on-macos-with-xcode">Build for iOS on macOS with xcode</a></li>
<li><a href="/zh/posts/tech/ai/deploy-pytorch-model-to-android-with-ncnn/#build-for-webassembly">Build for WebAssembly</a></li>
<li><a href="/zh/posts/tech/ai/deploy-pytorch-model-to-android-with-ncnn/#build-for-allwinner-d1">Build for AllWinner D1</a></li>
<li><a href="/zh/posts/tech/ai/deploy-pytorch-model-to-android-with-ncnn/#build-for-loongson-2k1000">Build for Loongson 2K1000</a></li>
<li><a href="/zh/posts/tech/ai/deploy-pytorch-model-to-android-with-ncnn/#Build-for-Termux-on-Android">Build for Termux on Android</a></li>
</ul>
<hr>
<h3 id="build-for-linux">Build for Linux</h3>
<p>Install required build dependencies:</p>
<ul>
<li>git</li>
<li>g++</li>
<li>cmake</li>
<li>protocol buffer (protobuf) header files and the protobuf compiler</li>
<li>vulkan header files and loader library</li>
<li>glslang</li>
<li>(optional) opencv # For building examples</li>
</ul>
<p>Generally, if you have an Intel, AMD or Nvidia GPU from the last 10 years, Vulkan can be used easily.</p>
<p>On some systems there are no Vulkan drivers easily available at the moment (October 2020), so you might need to disable use of Vulkan on them. This applies to the Raspberry Pi 3 (an experimental open-source Vulkan driver is in the works, but it is not ready yet). Nvidia Tegra series devices (like the Nvidia Jetson) should support Vulkan. Ensure you have the most recent software installed for the best experience.</p>
<p>On Debian, Ubuntu or Raspberry Pi OS, you can install all required dependencies using:</p>
<div class="highlight"><pre tabindex="0" class="chroma"><code class="language-shell" data-lang="shell"><span class="line"><span class="cl">sudo apt install build-essential git cmake libprotobuf-dev protobuf-compiler libvulkan-dev vulkan-utils libopencv-dev
</span></span></code></pre></div><p>To use the Vulkan backend, install the Vulkan header files, a Vulkan driver loader, a GLSL-to-SPIR-V compiler and the vulkaninfo tool, preferably from your distribution repositories. Alternatively, download and install the full Vulkan SDK (about 200 MB in size; it contains all header files, documentation and a prebuilt loader, as well as some extra tools and the source code of everything) from <a href="https://vulkan.lunarg.com/sdk/home">https://vulkan.lunarg.com/sdk/home</a></p>
<div class="highlight"><pre tabindex="0" class="chroma"><code class="language-shell" data-lang="shell"><span class="line"><span class="cl">wget https://sdk.lunarg.com/sdk/download/1.2.189.0/linux/vulkansdk-linux-x86_64-1.2.189.0.tar.gz?Human<span class="o">=</span><span class="nb">true</span> -O vulkansdk-linux-x86_64-1.2.189.0.tar.gz
</span></span><span class="line"><span class="cl">tar -xf vulkansdk-linux-x86_64-1.2.189.0.tar.gz
</span></span><span class="line"><span class="cl"><span class="nb">export</span> <span class="nv">VULKAN_SDK</span><span class="o">=</span><span class="k">$(</span><span class="nb">pwd</span><span class="k">)</span>/1.2.189.0/x86_64
</span></span></code></pre></div><p>To use Vulkan after building ncnn, you will also need a Vulkan driver for your GPU. For AMD and Intel GPUs these are part of the Mesa graphics driver, which is usually installed by default on most distros (e.g. <code>sudo apt install mesa-vulkan-drivers</code> on Debian/Ubuntu). For Nvidia GPUs the proprietary Nvidia driver must be downloaded and installed (some distros make this easier). After installing the Vulkan driver, confirm the Vulkan libraries and driver are working by running <code>vulkaninfo</code> or <code>vulkaninfo | grep deviceType</code>; it should list the GPU device type. If more than one GPU is installed (including an integrated GPU alongside a discrete GPU, as commonly found in laptops), you might need to note the order of devices so you can select the right one later.</p>
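<p>As a quick sanity check, the snippet below (a sketch; it degrades gracefully when no driver is present) confirms whether a working Vulkan driver is visible:</p>

```shell
# List detected Vulkan device types; print a notice if no driver is found
vulkaninfo 2>/dev/null | grep deviceType || echo "no Vulkan driver detected"
```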
<p>On Nvidia Jetson devices, Vulkan support should be present in the Nvidia-provided SDK (JetPack) or in prebuilt OS images.</p>
<p>Raspberry Pi Vulkan drivers do exist, but they are not yet mature. You are free to experiment at your own discretion, and to report results and performance.</p>
<div class="highlight"><pre tabindex="0" class="chroma"><code class="language-shell" data-lang="shell"><span class="line"><span class="cl"><span class="nb">cd</span> ncnn
</span></span><span class="line"><span class="cl">mkdir -p build
</span></span><span class="line"><span class="cl"><span class="nb">cd</span> build
</span></span><span class="line"><span class="cl">cmake -DCMAKE_BUILD_TYPE<span class="o">=</span>Release -DNCNN_VULKAN<span class="o">=</span>ON -DNCNN_SYSTEM_GLSLANG<span class="o">=</span>ON -DNCNN_BUILD_EXAMPLES<span class="o">=</span>ON ..
</span></span><span class="line"><span class="cl">make -j<span class="k">$(</span>nproc<span class="k">)</span>
</span></span></code></pre></div><p>You can add <code>-GNinja</code> to <code>cmake</code> above to use Ninja build system (invoke build using <code>ninja</code> or <code>cmake --build .</code>).</p>
<p>For Nvidia Jetson devices, add <code>-DCMAKE_TOOLCHAIN_FILE=../toolchains/jetson.toolchain.cmake</code> to cmake.</p>
<p>For Raspberry Pi 3, add <code>-DCMAKE_TOOLCHAIN_FILE=../toolchains/pi3.toolchain.cmake -DPI3=ON</code> to cmake. You can also consider disabling Vulkan support, as the Vulkan drivers for Raspberry Pi are still not mature; it doesn&rsquo;t hurt to build the support in and simply not use it.</p>
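<p>Putting the Raspberry Pi 3 note into a full configure step, the invocation might look like this (a sketch, assumed to be run from the <code>build</code> directory; disabling Vulkan is optional, as noted above):</p>

```shell
# Raspberry Pi 3 configuration with Vulkan disabled
cmake -DCMAKE_TOOLCHAIN_FILE=../toolchains/pi3.toolchain.cmake -DPI3=ON \
      -DCMAKE_BUILD_TYPE=Release -DNCNN_VULKAN=OFF ..
make -j$(nproc)
```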
<p>Verify the build by running some examples:</p>
<div class="highlight"><pre tabindex="0" class="chroma"><code class="language-shell" data-lang="shell"><span class="line"><span class="cl"><span class="nb">cd</span> ../examples
</span></span><span class="line"><span class="cl">../build/examples/squeezenet ../images/256-ncnn.png
</span></span><span class="line"><span class="cl"><span class="o">[</span><span class="m">0</span> AMD RADV FIJI <span class="o">(</span>LLVM 10.0.1<span class="o">)]</span>  <span class="nv">queueC</span><span class="o">=</span>1<span class="o">[</span>4<span class="o">]</span>  <span class="nv">queueG</span><span class="o">=</span>0<span class="o">[</span>1<span class="o">]</span>  <span class="nv">queueT</span><span class="o">=</span>0<span class="o">[</span>1<span class="o">]</span>
</span></span><span class="line"><span class="cl"><span class="o">[</span><span class="m">0</span> AMD RADV FIJI <span class="o">(</span>LLVM 10.0.1<span class="o">)]</span>  <span class="nv">bugsbn1</span><span class="o">=</span><span class="m">0</span>  <span class="nv">buglbia</span><span class="o">=</span><span class="m">0</span>  <span class="nv">bugcopc</span><span class="o">=</span><span class="m">0</span>  <span class="nv">bugihfa</span><span class="o">=</span><span class="m">0</span>
</span></span><span class="line"><span class="cl"><span class="o">[</span><span class="m">0</span> AMD RADV FIJI <span class="o">(</span>LLVM 10.0.1<span class="o">)]</span>  <span class="nv">fp16p</span><span class="o">=</span><span class="m">1</span>  <span class="nv">fp16s</span><span class="o">=</span><span class="m">1</span>  <span class="nv">fp16a</span><span class="o">=</span><span class="m">0</span>  <span class="nv">int8s</span><span class="o">=</span><span class="m">1</span>  <span class="nv">int8a</span><span class="o">=</span><span class="m">1</span>
</span></span><span class="line"><span class="cl"><span class="nv">532</span> <span class="o">=</span> 0.163452
</span></span><span class="line"><span class="cl"><span class="nv">920</span> <span class="o">=</span> 0.093140
</span></span><span class="line"><span class="cl"><span class="nv">716</span> <span class="o">=</span> 0.061584
</span></span></code></pre></div><p>You can also run benchmarks (the 4th argument is the GPU device index to use; refer to <code>vulkaninfo</code> if you have more than one GPU):</p>
<div class="highlight"><pre tabindex="0" class="chroma"><code class="language-shell" data-lang="shell"><span class="line"><span class="cl"><span class="nb">cd</span> ../benchmark
</span></span><span class="line"><span class="cl">../build/benchmark/benchncnn <span class="m">10</span> <span class="k">$(</span>nproc<span class="k">)</span> <span class="m">0</span> <span class="m">0</span>
</span></span><span class="line"><span class="cl"><span class="o">[</span><span class="m">0</span> AMD RADV FIJI <span class="o">(</span>LLVM 10.0.1<span class="o">)]</span>  <span class="nv">queueC</span><span class="o">=</span>1<span class="o">[</span>4<span class="o">]</span>  <span class="nv">queueG</span><span class="o">=</span>0<span class="o">[</span>1<span class="o">]</span>  <span class="nv">queueT</span><span class="o">=</span>0<span class="o">[</span>1<span class="o">]</span>
</span></span><span class="line"><span class="cl"><span class="o">[</span><span class="m">0</span> AMD RADV FIJI <span class="o">(</span>LLVM 10.0.1<span class="o">)]</span>  <span class="nv">bugsbn1</span><span class="o">=</span><span class="m">0</span>  <span class="nv">buglbia</span><span class="o">=</span><span class="m">0</span>  <span class="nv">bugcopc</span><span class="o">=</span><span class="m">0</span>  <span class="nv">bugihfa</span><span class="o">=</span><span class="m">0</span>
</span></span><span class="line"><span class="cl"><span class="o">[</span><span class="m">0</span> AMD RADV FIJI <span class="o">(</span>LLVM 10.0.1<span class="o">)]</span>  <span class="nv">fp16p</span><span class="o">=</span><span class="m">1</span>  <span class="nv">fp16s</span><span class="o">=</span><span class="m">1</span>  <span class="nv">fp16a</span><span class="o">=</span><span class="m">0</span>  <span class="nv">int8s</span><span class="o">=</span><span class="m">1</span>  <span class="nv">int8a</span><span class="o">=</span><span class="m">1</span>
</span></span><span class="line"><span class="cl"><span class="nv">num_threads</span> <span class="o">=</span> <span class="m">4</span>
</span></span><span class="line"><span class="cl"><span class="nv">powersave</span> <span class="o">=</span> <span class="m">0</span>
</span></span><span class="line"><span class="cl"><span class="nv">gpu_device</span> <span class="o">=</span> <span class="m">0</span>
</span></span><span class="line"><span class="cl"><span class="nv">cooling_down</span> <span class="o">=</span> <span class="m">1</span>
</span></span><span class="line"><span class="cl">          squeezenet  <span class="nv">min</span> <span class="o">=</span>    4.68  <span class="nv">max</span> <span class="o">=</span>    4.99  <span class="nv">avg</span> <span class="o">=</span>    4.85
</span></span><span class="line"><span class="cl">     squeezenet_int8  <span class="nv">min</span> <span class="o">=</span>   38.52  <span class="nv">max</span> <span class="o">=</span>   66.90  <span class="nv">avg</span> <span class="o">=</span>   48.52
</span></span><span class="line"><span class="cl">...
</span></span></code></pre></div><p>To run benchmarks on a CPU, set the 4th argument (the GPU device index) to <code>-1</code>.</p>
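<p>For example, a CPU-only benchmark run might look like this (a sketch; it assumes the build above completed and is run from the <code>benchmark</code> directory):</p>

```shell
# loop count = 10, threads = 4, powersave = 0, gpu device = -1 (CPU only)
../build/benchmark/benchncnn 10 4 0 -1
```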
<hr>
<h3 id="build-for-windows-x64-using-visual-studio-community-2017">Build for Windows x64 using Visual Studio Community 2017</h3>
<p>Download and Install Visual Studio Community 2017 from <a href="https://visualstudio.microsoft.com/vs/community/">https://visualstudio.microsoft.com/vs/community/</a></p>
<p>Start the command prompt: <code>Start → Programs → Visual Studio 2017 → Visual Studio Tools → x64 Native Tools Command Prompt for VS 2017</code></p>
<p>Download protobuf-3.4.0 from <a href="https://github.com/google/protobuf/archive/v3.4.0.zip">https://github.com/google/protobuf/archive/v3.4.0.zip</a></p>
<p>Build protobuf library:</p>
<div class="highlight"><pre tabindex="0" class="chroma"><code class="language-shell" data-lang="shell"><span class="line"><span class="cl"><span class="nb">cd</span> &lt;protobuf-root-dir&gt;
</span></span><span class="line"><span class="cl">mkdir build
</span></span><span class="line"><span class="cl"><span class="nb">cd</span> build
</span></span><span class="line"><span class="cl">cmake -G<span class="s2">&#34;NMake Makefiles&#34;</span> -DCMAKE_BUILD_TYPE<span class="o">=</span>Release -DCMAKE_INSTALL_PREFIX<span class="o">=</span>%cd%/install -Dprotobuf_BUILD_TESTS<span class="o">=</span>OFF -Dprotobuf_MSVC_STATIC_RUNTIME<span class="o">=</span>OFF ../cmake
</span></span><span class="line"><span class="cl">nmake
</span></span><span class="line"><span class="cl">nmake install
</span></span></code></pre></div><p>(optional) Download and install Vulkan SDK from <a href="https://vulkan.lunarg.com/sdk/home">https://vulkan.lunarg.com/sdk/home</a></p>
<p>Build ncnn library (replace <code>protobuf-root-dir</code> with a proper path):</p>
<div class="highlight"><pre tabindex="0" class="chroma"><code class="language-shell" data-lang="shell"><span class="line"><span class="cl"><span class="nb">cd</span> &lt;ncnn-root-dir&gt;
</span></span><span class="line"><span class="cl">mkdir -p build
</span></span><span class="line"><span class="cl"><span class="nb">cd</span> build
</span></span><span class="line"><span class="cl">cmake -G<span class="s2">&#34;NMake Makefiles&#34;</span> -DCMAKE_BUILD_TYPE<span class="o">=</span>Release -DCMAKE_INSTALL_PREFIX<span class="o">=</span>%cd%/install -DProtobuf_INCLUDE_DIR<span class="o">=</span>&lt;protobuf-root-dir&gt;/build/install/include -DProtobuf_LIBRARIES<span class="o">=</span>&lt;protobuf-root-dir&gt;/build/install/lib/libprotobuf.lib -DProtobuf_PROTOC_EXECUTABLE<span class="o">=</span>&lt;protobuf-root-dir&gt;/build/install/bin/protoc.exe -DNCNN_VULKAN<span class="o">=</span>ON ..
</span></span><span class="line"><span class="cl">nmake
</span></span><span class="line"><span class="cl">nmake install
</span></span></code></pre></div><p>Note: To speed up the compilation process on multi-core machines, configuring <code>cmake</code> to use <code>jom</code> or <code>ninja</code> via the <code>-G</code> flag is recommended.</p>
<hr>
<h3 id="build-for-macos">Build for macOS</h3>
<p>First install Xcode or Xcode Command Line Tools according to your needs.</p>
<p>Then install <code>protobuf</code> and <code>libomp</code> via Homebrew:</p>
<div class="highlight"><pre tabindex="0" class="chroma"><code class="language-shell" data-lang="shell"><span class="line"><span class="cl">brew install protobuf libomp
</span></span></code></pre></div><p>Download and install Vulkan SDK from <a href="https://vulkan.lunarg.com/sdk/home">https://vulkan.lunarg.com/sdk/home</a></p>
<div class="highlight"><pre tabindex="0" class="chroma"><code class="language-shell" data-lang="shell"><span class="line"><span class="cl">wget https://sdk.lunarg.com/sdk/download/1.2.189.0/mac/vulkansdk-macos-1.2.189.0.dmg?Human<span class="o">=</span><span class="nb">true</span> -O vulkansdk-macos-1.2.189.0.dmg
</span></span><span class="line"><span class="cl">hdiutil attach vulkansdk-macos-1.2.189.0.dmg
</span></span><span class="line"><span class="cl">sudo /Volumes/vulkansdk-macos-1.2.189.0/InstallVulkan.app/Contents/MacOS/InstallVulkan --root <span class="sb">`</span><span class="nb">pwd</span><span class="sb">`</span>/vulkansdk-macos-1.2.189.0 --accept-licenses --default-answer --confirm-command install
</span></span><span class="line"><span class="cl">hdiutil detach /Volumes/vulkansdk-macos-1.2.189.0
</span></span><span class="line"><span class="cl">
</span></span><span class="line"><span class="cl"><span class="c1"># setup env</span>
</span></span><span class="line"><span class="cl"><span class="nb">export</span> <span class="nv">VULKAN_SDK</span><span class="o">=</span><span class="sb">`</span><span class="nb">pwd</span><span class="sb">`</span>/vulkansdk-macos-1.2.189.0/macOS
</span></span></code></pre></div><div class="highlight"><pre tabindex="0" class="chroma"><code class="language-shell" data-lang="shell"><span class="line"><span class="cl"><span class="nb">cd</span> &lt;ncnn-root-dir&gt;
</span></span><span class="line"><span class="cl">mkdir -p build
</span></span><span class="line"><span class="cl"><span class="nb">cd</span> build
</span></span><span class="line"><span class="cl">
</span></span><span class="line"><span class="cl">cmake -DCMAKE_OSX_ARCHITECTURES<span class="o">=</span><span class="s2">&#34;x86_64;arm64&#34;</span> <span class="se">\
</span></span></span><span class="line"><span class="cl">    -DVulkan_INCLUDE_DIR<span class="o">=</span><span class="sb">`</span><span class="nb">pwd</span><span class="sb">`</span>/../vulkansdk-macos-1.2.189.0/MoltenVK/include <span class="se">\
</span></span></span><span class="line"><span class="cl">    -DVulkan_LIBRARY<span class="o">=</span><span class="sb">`</span><span class="nb">pwd</span><span class="sb">`</span>/../vulkansdk-macos-1.2.189.0/MoltenVK/dylib/macOS/libMoltenVK.dylib <span class="se">\
</span></span></span><span class="line"><span class="cl">    -DNCNN_VULKAN<span class="o">=</span>ON -DNCNN_BUILD_EXAMPLES<span class="o">=</span>ON ..
</span></span><span class="line"><span class="cl">
</span></span><span class="line"><span class="cl">cmake --build . -j <span class="m">4</span>
</span></span><span class="line"><span class="cl">cmake --build . --target install
</span></span></code></pre></div><p><em>Note: If you encounter <code>libomp</code>-related errors during installation, you can also check our GitHub Actions workflow <a href="https://github.com/Tencent/ncnn/blob/d91cccf/.github/workflows/macos-x64-gpu.yml#L50-L68">here</a> for how to install and use <code>openmp</code>.</em></p>
<hr>
<h3 id="build-for-arm-cortex-a-family-with-cross-compiling">Build for ARM Cortex-A family with cross-compiling</h3>
<p>Download ARM toolchain from <a href="https://developer.arm.com/open-source/gnu-toolchain/gnu-a/downloads">https://developer.arm.com/open-source/gnu-toolchain/gnu-a/downloads</a></p>
<div class="highlight"><pre tabindex="0" class="chroma"><code class="language-shell" data-lang="shell"><span class="line"><span class="cl"><span class="nb">export</span> <span class="nv">PATH</span><span class="o">=</span><span class="s2">&#34;&lt;your-toolchain-compiler-path&gt;:</span><span class="si">${</span><span class="nv">PATH</span><span class="si">}</span><span class="s2">&#34;</span>
</span></span></code></pre></div><p>Alternatively, install a cross-compiler provided by your distribution (e.g. on Debian/Ubuntu, <code>sudo apt install g++-arm-linux-gnueabi g++-arm-linux-gnueabihf g++-aarch64-linux-gnu</code>).</p>
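<p>Whichever route you take, it is worth verifying that the cross-compiler is reachable before configuring (a sketch; the triplet prefix depends on which toolchain you installed):</p>

```shell
# Print the compiler version, or a notice if the toolchain is not on PATH
arm-linux-gnueabihf-g++ --version 2>/dev/null || echo "cross-compiler not found in PATH"
```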
<p>Depending on your needs, build one or more of the targets below.</p>
<p>AArch32 target with soft float (arm-linux-gnueabi)</p>
<div class="highlight"><pre tabindex="0" class="chroma"><code class="language-shell" data-lang="shell"><span class="line"><span class="cl"><span class="nb">cd</span> &lt;ncnn-root-dir&gt;
</span></span><span class="line"><span class="cl">mkdir -p build-arm-linux-gnueabi
</span></span><span class="line"><span class="cl"><span class="nb">cd</span> build-arm-linux-gnueabi
</span></span><span class="line"><span class="cl">cmake -DCMAKE_TOOLCHAIN_FILE<span class="o">=</span>../toolchains/arm-linux-gnueabi.toolchain.cmake ..
</span></span><span class="line"><span class="cl">make -j<span class="k">$(</span>nproc<span class="k">)</span>
</span></span></code></pre></div><p>AArch32 target with hard float (arm-linux-gnueabihf)</p>
<div class="highlight"><pre tabindex="0" class="chroma"><code class="language-shell" data-lang="shell"><span class="line"><span class="cl"><span class="nb">cd</span> &lt;ncnn-root-dir&gt;
</span></span><span class="line"><span class="cl">mkdir -p build-arm-linux-gnueabihf
</span></span><span class="line"><span class="cl"><span class="nb">cd</span> build-arm-linux-gnueabihf
</span></span><span class="line"><span class="cl">cmake -DCMAKE_TOOLCHAIN_FILE<span class="o">=</span>../toolchains/arm-linux-gnueabihf.toolchain.cmake ..
</span></span><span class="line"><span class="cl">make -j<span class="k">$(</span>nproc<span class="k">)</span>
</span></span></code></pre></div><p>AArch64 GNU/Linux target (aarch64-linux-gnu)</p>
<div class="highlight"><pre tabindex="0" class="chroma"><code class="language-shell" data-lang="shell"><span class="line"><span class="cl"><span class="nb">cd</span> &lt;ncnn-root-dir&gt;
</span></span><span class="line"><span class="cl">mkdir -p build-aarch64-linux-gnu
</span></span><span class="line"><span class="cl"><span class="nb">cd</span> build-aarch64-linux-gnu
</span></span><span class="line"><span class="cl">cmake -DCMAKE_TOOLCHAIN_FILE<span class="o">=</span>../toolchains/aarch64-linux-gnu.toolchain.cmake ..
</span></span><span class="line"><span class="cl">make -j<span class="k">$(</span>nproc<span class="k">)</span>
</span></span></code></pre></div><hr>
<h3 id="build-for-hisilicon-platform-with-cross-compiling">Build for Hisilicon platform with cross-compiling</h3>
<p>Download and install the Hisilicon SDK. The toolchain should be in <code>/opt/hisi-linux/x86-arm</code>.</p>
<div class="highlight"><pre tabindex="0" class="chroma"><code class="language-shell" data-lang="shell"><span class="line"><span class="cl"><span class="nb">cd</span> &lt;ncnn-root-dir&gt;
</span></span><span class="line"><span class="cl">mkdir -p build
</span></span><span class="line"><span class="cl"><span class="nb">cd</span> build
</span></span><span class="line"><span class="cl">
</span></span><span class="line"><span class="cl"><span class="c1"># Choose one cmake toolchain file depends on your target platform</span>
</span></span><span class="line"><span class="cl">cmake -DCMAKE_TOOLCHAIN_FILE<span class="o">=</span>../toolchains/hisiv300.toolchain.cmake ..
</span></span><span class="line"><span class="cl">cmake -DCMAKE_TOOLCHAIN_FILE<span class="o">=</span>../toolchains/hisiv500.toolchain.cmake ..
</span></span><span class="line"><span class="cl">cmake -DCMAKE_TOOLCHAIN_FILE<span class="o">=</span>../toolchains/himix100.toolchain.cmake ..
</span></span><span class="line"><span class="cl">cmake -DCMAKE_TOOLCHAIN_FILE<span class="o">=</span>../toolchains/himix200.toolchain.cmake ..
</span></span><span class="line"><span class="cl">
</span></span><span class="line"><span class="cl">make -j<span class="k">$(</span>nproc<span class="k">)</span>
</span></span><span class="line"><span class="cl">make install
</span></span></code></pre></div><hr>
<h3 id="build-for-android">Build for Android</h3>
<p>You can use the pre-built ncnn-android-lib.zip from <a href="https://github.com/Tencent/ncnn/releases">https://github.com/Tencent/ncnn/releases</a></p>
<p>Download Android NDK from <a href="http://developer.android.com/ndk/downloads/index.html">http://developer.android.com/ndk/downloads/index.html</a> and install it, for example:</p>
<div class="highlight"><pre tabindex="0" class="chroma"><code class="language-shell" data-lang="shell"><span class="line"><span class="cl">unzip android-ndk-r21d-linux-x86_64.zip
</span></span><span class="line"><span class="cl"><span class="nb">export</span> <span class="nv">ANDROID_NDK</span><span class="o">=</span>&lt;your-ndk-root-path&gt;
</span></span></code></pre></div><p>(Optional) Remove the hardcoded debug flag in the Android NDK (see this <a href="https://github.com/android-ndk/ndk/issues/243">android-ndk issue</a>):</p>
<pre tabindex="0"><code># open $ANDROID_NDK/build/cmake/android.toolchain.cmake
# delete &#34;-g&#34; line
list(APPEND ANDROID_COMPILER_FLAGS
  -g
  -DANDROID
</code></pre><p>Build the armv7 library:</p>
<div class="highlight"><pre tabindex="0" class="chroma"><code class="language-shell" data-lang="shell"><span class="line"><span class="cl"><span class="nb">cd</span> &lt;ncnn-root-dir&gt;
</span></span><span class="line"><span class="cl">mkdir -p build-android-armv7
</span></span><span class="line"><span class="cl"><span class="nb">cd</span> build-android-armv7
</span></span><span class="line"><span class="cl">
</span></span><span class="line"><span class="cl">cmake -DCMAKE_TOOLCHAIN_FILE<span class="o">=</span><span class="s2">&#34;</span><span class="nv">$ANDROID_NDK</span><span class="s2">/build/cmake/android.toolchain.cmake&#34;</span> <span class="se">\
</span></span></span><span class="line"><span class="cl">    -DANDROID_ABI<span class="o">=</span><span class="s2">&#34;armeabi-v7a&#34;</span> -DANDROID_ARM_NEON<span class="o">=</span>ON <span class="se">\
</span></span></span><span class="line"><span class="cl">    -DANDROID_PLATFORM<span class="o">=</span>android-14 ..
</span></span><span class="line"><span class="cl">
</span></span><span class="line"><span class="cl"><span class="c1"># If you want to enable Vulkan, platform api version &gt;= android-24 is needed</span>
</span></span><span class="line"><span class="cl">cmake -DCMAKE_TOOLCHAIN_FILE<span class="o">=</span><span class="s2">&#34;</span><span class="nv">$ANDROID_NDK</span><span class="s2">/build/cmake/android.toolchain.cmake&#34;</span> <span class="se">\
</span></span></span><span class="line"><span class="cl">  -DANDROID_ABI<span class="o">=</span><span class="s2">&#34;armeabi-v7a&#34;</span> -DANDROID_ARM_NEON<span class="o">=</span>ON <span class="se">\
</span></span></span><span class="line"><span class="cl">  -DANDROID_PLATFORM<span class="o">=</span>android-24 -DNCNN_VULKAN<span class="o">=</span>ON ..
</span></span><span class="line"><span class="cl">
</span></span><span class="line"><span class="cl">make -j<span class="k">$(</span>nproc<span class="k">)</span>
</span></span><span class="line"><span class="cl">make install
</span></span></code></pre></div><p>Pick <code>build-android-armv7/install</code> folder for further JNI usage.</p>
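<p>For the JNI step itself, the installed folder can be consumed from an Android CMake project, assuming the install tree provides the usual <code>lib/cmake/ncnn</code> package config; the path and the <code>yourjni</code> target name below are illustrative placeholders:</p>
<pre tabindex="0"><code># CMakeLists.txt fragment (paths and target names are placeholders)
set(ncnn_DIR ${CMAKE_SOURCE_DIR}/ncnn/build-android-armv7/install/lib/cmake/ncnn)
find_package(ncnn REQUIRED)
target_link_libraries(yourjni ncnn)
</code></pre>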
<p>Build aarch64 library:</p>
<div class="highlight"><pre tabindex="0" class="chroma"><code class="language-shell" data-lang="shell"><span class="line"><span class="cl"><span class="nb">cd</span> &lt;ncnn-root-dir&gt;
</span></span><span class="line"><span class="cl">mkdir -p build-android-aarch64
</span></span><span class="line"><span class="cl"><span class="nb">cd</span> build-android-aarch64
</span></span><span class="line"><span class="cl">
</span></span><span class="line"><span class="cl">cmake -DCMAKE_TOOLCHAIN_FILE<span class="o">=</span><span class="s2">&#34;</span><span class="nv">$ANDROID_NDK</span><span class="s2">/build/cmake/android.toolchain.cmake&#34;</span><span class="se">\
</span></span></span><span class="line"><span class="cl">  -DANDROID_ABI<span class="o">=</span><span class="s2">&#34;arm64-v8a&#34;</span> <span class="se">\
</span></span></span><span class="line"><span class="cl">  -DANDROID_PLATFORM<span class="o">=</span>android-21 ..
</span></span><span class="line"><span class="cl">
</span></span><span class="line"><span class="cl"><span class="c1"># If you want to enable Vulkan, platform api version &gt;= android-24 is needed</span>
</span></span><span class="line"><span class="cl">cmake -DCMAKE_TOOLCHAIN_FILE<span class="o">=</span><span class="s2">&#34;</span><span class="nv">$ANDROID_NDK</span><span class="s2">/build/cmake/android.toolchain.cmake&#34;</span> <span class="se">\
</span></span></span><span class="line"><span class="cl">  -DANDROID_ABI<span class="o">=</span><span class="s2">&#34;arm64-v8a&#34;</span> <span class="se">\
</span></span></span><span class="line"><span class="cl">  -DANDROID_PLATFORM<span class="o">=</span>android-24 -DNCNN_VULKAN<span class="o">=</span>ON ..
</span></span><span class="line"><span class="cl">
</span></span><span class="line"><span class="cl">make -j<span class="k">$(</span>nproc<span class="k">)</span>
</span></span><span class="line"><span class="cl">make install
</span></span></code></pre></div><p>Pick <code>build-android-aarch64/install</code> folder for further JNI usage.</p>
<hr>
<h3 id="build-for-ios-on-macos-with-xcode">Build for iOS on macOS with xcode</h3>
<p>You can use the pre-built ncnn.framework, glslang.framework, and openmp.framework from <a href="https://github.com/Tencent/ncnn/releases">https://github.com/Tencent/ncnn/releases</a>.</p>
<p>Install Xcode.</p>
<p>You can replace <code>-DENABLE_BITCODE=0</code> with <code>-DENABLE_BITCODE=1</code> in the following cmake arguments if you want to build bitcode-enabled libraries.</p>
<p>Download and install OpenMP for the multithreading inference feature on iPhoneOS:</p>
<div class="highlight"><pre tabindex="0" class="chroma"><code class="language-shell" data-lang="shell"><span class="line"><span class="cl">wget https://github.com/llvm/llvm-project/releases/download/llvmorg-11.0.0/openmp-11.0.0.src.tar.xz
</span></span><span class="line"><span class="cl">tar -xf openmp-11.0.0.src.tar.xz
</span></span><span class="line"><span class="cl"><span class="nb">cd</span> openmp-11.0.0.src
</span></span><span class="line"><span class="cl">
</span></span><span class="line"><span class="cl"><span class="c1"># apply some compilation fix</span>
</span></span><span class="line"><span class="cl">sed -i<span class="s1">&#39;&#39;</span> -e <span class="s1">&#39;/.size __kmp_unnamed_critical_addr/d&#39;</span> runtime/src/z_Linux_asm.S
</span></span><span class="line"><span class="cl">sed -i<span class="s1">&#39;&#39;</span> -e <span class="s1">&#39;s/__kmp_unnamed_critical_addr/___kmp_unnamed_critical_addr/g&#39;</span> runtime/src/z_Linux_asm.S
</span></span><span class="line"><span class="cl">
</span></span><span class="line"><span class="cl">mkdir -p build-ios
</span></span><span class="line"><span class="cl"><span class="nb">cd</span> build-ios
</span></span><span class="line"><span class="cl">
</span></span><span class="line"><span class="cl">cmake -DCMAKE_TOOLCHAIN_FILE<span class="o">=</span>../toolchains/ios.toolchain.cmake -DCMAKE_BUILD_TYPE<span class="o">=</span>Release -DCMAKE_INSTALL_PREFIX<span class="o">=</span>install <span class="se">\
</span></span></span><span class="line"><span class="cl">    -DIOS_PLATFORM<span class="o">=</span>OS -DENABLE_BITCODE<span class="o">=</span><span class="m">0</span> -DENABLE_ARC<span class="o">=</span><span class="m">0</span> -DENABLE_VISIBILITY<span class="o">=</span><span class="m">0</span> -DIOS_ARCH<span class="o">=</span><span class="s2">&#34;armv7;arm64;arm64e&#34;</span> <span class="se">\
</span></span></span><span class="line"><span class="cl">    -DPERL_EXECUTABLE<span class="o">=</span>/usr/local/bin/perl <span class="se">\
</span></span></span><span class="line"><span class="cl">    -DLIBOMP_ENABLE_SHARED<span class="o">=</span>OFF -DLIBOMP_OMPT_SUPPORT<span class="o">=</span>OFF -DLIBOMP_USE_HWLOC<span class="o">=</span>OFF ..
</span></span><span class="line"><span class="cl">
</span></span><span class="line"><span class="cl">cmake --build . -j <span class="m">4</span>
</span></span><span class="line"><span class="cl">cmake --build . --target install
</span></span><span class="line"><span class="cl">
</span></span><span class="line"><span class="cl"><span class="c1"># copy openmp library and header files to xcode toolchain sysroot</span>
</span></span><span class="line"><span class="cl">sudo cp install/include/* /Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/usr/include
</span></span><span class="line"><span class="cl">sudo cp install/lib/libomp.a /Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/usr/lib
</span></span></code></pre></div><p>Download and install OpenMP for the multithreading inference feature on iPhoneSimulator:</p>
<div class="highlight"><pre tabindex="0" class="chroma"><code class="language-shell" data-lang="shell"><span class="line"><span class="cl">wget https://github.com/llvm/llvm-project/releases/download/llvmorg-11.0.0/openmp-11.0.0.src.tar.xz
</span></span><span class="line"><span class="cl">tar -xf openmp-11.0.0.src.tar.xz
</span></span><span class="line"><span class="cl"><span class="nb">cd</span> openmp-11.0.0.src
</span></span><span class="line"><span class="cl">
</span></span><span class="line"><span class="cl"><span class="c1"># apply some compilation fix</span>
</span></span><span class="line"><span class="cl">sed -i<span class="s1">&#39;&#39;</span> -e <span class="s1">&#39;/.size __kmp_unnamed_critical_addr/d&#39;</span> runtime/src/z_Linux_asm.S
</span></span><span class="line"><span class="cl">sed -i<span class="s1">&#39;&#39;</span> -e <span class="s1">&#39;s/__kmp_unnamed_critical_addr/___kmp_unnamed_critical_addr/g&#39;</span> runtime/src/z_Linux_asm.S
</span></span><span class="line"><span class="cl">
</span></span><span class="line"><span class="cl">mkdir -p build-ios-sim
</span></span><span class="line"><span class="cl"><span class="nb">cd</span> build-ios-sim
</span></span><span class="line"><span class="cl">
</span></span><span class="line"><span class="cl">cmake -DCMAKE_TOOLCHAIN_FILE<span class="o">=</span>../toolchains/ios.toolchain.cmake -DCMAKE_BUILD_TYPE<span class="o">=</span>Release -DCMAKE_INSTALL_PREFIX<span class="o">=</span>install <span class="se">\
</span></span></span><span class="line"><span class="cl">    -DIOS_PLATFORM<span class="o">=</span>SIMULATOR -DENABLE_BITCODE<span class="o">=</span><span class="m">0</span> -DENABLE_ARC<span class="o">=</span><span class="m">0</span> -DENABLE_VISIBILITY<span class="o">=</span><span class="m">0</span> -DIOS_ARCH<span class="o">=</span><span class="s2">&#34;i386;x86_64&#34;</span> <span class="se">\
</span></span></span><span class="line"><span class="cl">    -DPERL_EXECUTABLE<span class="o">=</span>/usr/local/bin/perl <span class="se">\
</span></span></span><span class="line"><span class="cl">    -DLIBOMP_ENABLE_SHARED<span class="o">=</span>OFF -DLIBOMP_OMPT_SUPPORT<span class="o">=</span>OFF -DLIBOMP_USE_HWLOC<span class="o">=</span>OFF ..
</span></span><span class="line"><span class="cl">
</span></span><span class="line"><span class="cl">cmake --build . -j <span class="m">4</span>
</span></span><span class="line"><span class="cl">cmake --build . --target install
</span></span><span class="line"><span class="cl">
</span></span><span class="line"><span class="cl"><span class="c1"># copy openmp library and header files to xcode toolchain sysroot</span>
</span></span><span class="line"><span class="cl">sudo cp install/include/* /Applications/Xcode.app/Contents/Developer/Platforms/iPhoneSimulator.platform/Developer/SDKs/iPhoneSimulator.sdk/usr/include
</span></span><span class="line"><span class="cl">sudo cp install/lib/libomp.a /Applications/Xcode.app/Contents/Developer/Platforms/iPhoneSimulator.platform/Developer/SDKs/iPhoneSimulator.sdk/usr/lib
</span></span></code></pre></div><p>Package openmp framework:</p>
<div class="highlight"><pre tabindex="0" class="chroma"><code class="language-shell" data-lang="shell"><span class="line"><span class="cl"><span class="nb">cd</span> &lt;openmp-root-dir&gt;
</span></span><span class="line"><span class="cl">
</span></span><span class="line"><span class="cl">mkdir -p openmp.framework/Versions/A/Headers
</span></span><span class="line"><span class="cl">mkdir -p openmp.framework/Versions/A/Resources
</span></span><span class="line"><span class="cl">ln -s A openmp.framework/Versions/Current
</span></span><span class="line"><span class="cl">ln -s Versions/Current/Headers openmp.framework/Headers
</span></span><span class="line"><span class="cl">ln -s Versions/Current/Resources openmp.framework/Resources
</span></span><span class="line"><span class="cl">ln -s Versions/Current/openmp openmp.framework/openmp
</span></span><span class="line"><span class="cl">lipo -create build-ios/install/lib/libomp.a build-ios-sim/install/lib/libomp.a -o openmp.framework/Versions/A/openmp
</span></span><span class="line"><span class="cl">cp -r build-ios/install/include/* openmp.framework/Versions/A/Headers/
</span></span><span class="line"><span class="cl">sed -e <span class="s1">&#39;s/__NAME__/openmp/g&#39;</span> -e <span class="s1">&#39;s/__IDENTIFIER__/org.llvm.openmp/g&#39;</span> -e <span class="s1">&#39;s/__VERSION__/11.0/g&#39;</span> Info.plist &gt; openmp.framework/Versions/A/Resources/Info.plist
</span></span></code></pre></div><p>Download and install Vulkan SDK from <a href="https://vulkan.lunarg.com/sdk/home">https://vulkan.lunarg.com/sdk/home</a></p>
<div class="highlight"><pre tabindex="0" class="chroma"><code class="language-shell" data-lang="shell"><span class="line"><span class="cl">wget https://sdk.lunarg.com/sdk/download/1.2.189.0/mac/vulkansdk-macos-1.2.189.0.dmg?Human<span class="o">=</span><span class="nb">true</span> -O vulkansdk-macos-1.2.189.0.dmg
</span></span><span class="line"><span class="cl">hdiutil attach vulkansdk-macos-1.2.189.0.dmg
</span></span><span class="line"><span class="cl">sudo /Volumes/vulkansdk-macos-1.2.189.0/InstallVulkan.app/Contents/MacOS/InstallVulkan --root <span class="sb">`</span><span class="nb">pwd</span><span class="sb">`</span>/vulkansdk-macos-1.2.189.0 --accept-licenses --default-answer --confirm-command install
</span></span><span class="line"><span class="cl">hdiutil detach /Volumes/vulkansdk-macos-1.2.189.0
</span></span><span class="line"><span class="cl">
</span></span><span class="line"><span class="cl"><span class="c1"># setup env</span>
</span></span><span class="line"><span class="cl"><span class="nb">export</span> <span class="nv">VULKAN_SDK</span><span class="o">=</span><span class="sb">`</span><span class="nb">pwd</span><span class="sb">`</span>/vulkansdk-macos-1.2.189.0/macOS
</span></span></code></pre></div><p>Build library for iPhoneOS:</p>
<div class="highlight"><pre tabindex="0" class="chroma"><code class="language-shell" data-lang="shell"><span class="line"><span class="cl"><span class="nb">cd</span> &lt;ncnn-root-dir&gt;
</span></span><span class="line"><span class="cl">mkdir -p build-ios
</span></span><span class="line"><span class="cl"><span class="nb">cd</span> build-ios
</span></span><span class="line"><span class="cl">
</span></span><span class="line"><span class="cl">cmake -DCMAKE_TOOLCHAIN_FILE<span class="o">=</span>../toolchains/ios.toolchain.cmake -DIOS_PLATFORM<span class="o">=</span>OS -DIOS_ARCH<span class="o">=</span><span class="s2">&#34;armv7;arm64;arm64e&#34;</span> <span class="se">\
</span></span></span><span class="line"><span class="cl">    -DENABLE_BITCODE<span class="o">=</span><span class="m">0</span> -DENABLE_ARC<span class="o">=</span><span class="m">0</span> -DENABLE_VISIBILITY<span class="o">=</span><span class="m">0</span> <span class="se">\
</span></span></span><span class="line"><span class="cl">    -DOpenMP_C_FLAGS<span class="o">=</span><span class="s2">&#34;-Xclang -fopenmp&#34;</span> -DOpenMP_CXX_FLAGS<span class="o">=</span><span class="s2">&#34;-Xclang -fopenmp&#34;</span> <span class="se">\
</span></span></span><span class="line"><span class="cl">    -DOpenMP_C_LIB_NAMES<span class="o">=</span><span class="s2">&#34;libomp&#34;</span> -DOpenMP_CXX_LIB_NAMES<span class="o">=</span><span class="s2">&#34;libomp&#34;</span> <span class="se">\
</span></span></span><span class="line"><span class="cl">    -DOpenMP_libomp_LIBRARY<span class="o">=</span><span class="s2">&#34;/Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/usr/lib/libomp.a&#34;</span> <span class="se">\
</span></span></span><span class="line"><span class="cl">    -DNCNN_BUILD_BENCHMARK<span class="o">=</span>OFF ..
</span></span><span class="line"><span class="cl">
</span></span><span class="line"><span class="cl"><span class="c1"># vulkan is only available on arm64 devices</span>
</span></span><span class="line"><span class="cl">cmake -DCMAKE_TOOLCHAIN_FILE<span class="o">=</span>../toolchains/ios.toolchain.cmake -DIOS_PLATFORM<span class="o">=</span>OS64 -DIOS_ARCH<span class="o">=</span><span class="s2">&#34;arm64;arm64e&#34;</span> <span class="se">\
</span></span></span><span class="line"><span class="cl">    -DENABLE_BITCODE<span class="o">=</span><span class="m">0</span> -DENABLE_ARC<span class="o">=</span><span class="m">0</span> -DENABLE_VISIBILITY<span class="o">=</span><span class="m">0</span> <span class="se">\
</span></span></span><span class="line"><span class="cl">    -DOpenMP_C_FLAGS<span class="o">=</span><span class="s2">&#34;-Xclang -fopenmp&#34;</span> -DOpenMP_CXX_FLAGS<span class="o">=</span><span class="s2">&#34;-Xclang -fopenmp&#34;</span> <span class="se">\
</span></span></span><span class="line"><span class="cl">    -DOpenMP_C_LIB_NAMES<span class="o">=</span><span class="s2">&#34;libomp&#34;</span> -DOpenMP_CXX_LIB_NAMES<span class="o">=</span><span class="s2">&#34;libomp&#34;</span> <span class="se">\
</span></span></span><span class="line"><span class="cl">    -DOpenMP_libomp_LIBRARY<span class="o">=</span><span class="s2">&#34;/Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/usr/lib/libomp.a&#34;</span> <span class="se">\
</span></span></span><span class="line"><span class="cl">    -DVulkan_INCLUDE_DIR<span class="o">=</span><span class="sb">`</span><span class="nb">pwd</span><span class="sb">`</span>/../vulkansdk-macos-1.2.189.0/MoltenVK/include <span class="se">\
</span></span></span><span class="line"><span class="cl">    -DVulkan_LIBRARY<span class="o">=</span><span class="sb">`</span><span class="nb">pwd</span><span class="sb">`</span>/../vulkansdk-macos-1.2.189.0/MoltenVK/dylib/iOS/libMoltenVK.dylib <span class="se">\
</span></span></span><span class="line"><span class="cl">    -DNCNN_VULKAN<span class="o">=</span>ON -DNCNN_BUILD_BENCHMARK<span class="o">=</span>OFF ..
</span></span><span class="line"><span class="cl">
</span></span><span class="line"><span class="cl">cmake --build . -j <span class="m">4</span>
</span></span><span class="line"><span class="cl">cmake --build . --target install
</span></span></code></pre></div><p>Build library for iPhoneSimulator:</p>
<div class="highlight"><pre tabindex="0" class="chroma"><code class="language-shell" data-lang="shell"><span class="line"><span class="cl"><span class="nb">cd</span> &lt;ncnn-root-dir&gt;
</span></span><span class="line"><span class="cl">mkdir -p build-ios-sim
</span></span><span class="line"><span class="cl"><span class="nb">cd</span> build-ios-sim
</span></span><span class="line"><span class="cl">
</span></span><span class="line"><span class="cl">cmake -DCMAKE_TOOLCHAIN_FILE<span class="o">=</span>../toolchains/ios.toolchain.cmake -DIOS_PLATFORM<span class="o">=</span>SIMULATOR -DIOS_ARCH<span class="o">=</span><span class="s2">&#34;i386;x86_64&#34;</span> <span class="se">\
</span></span></span><span class="line"><span class="cl">    -DENABLE_BITCODE<span class="o">=</span><span class="m">0</span> -DENABLE_ARC<span class="o">=</span><span class="m">0</span> -DENABLE_VISIBILITY<span class="o">=</span><span class="m">0</span> <span class="se">\
</span></span></span><span class="line"><span class="cl">    -DOpenMP_C_FLAGS<span class="o">=</span><span class="s2">&#34;-Xclang -fopenmp&#34;</span> -DOpenMP_CXX_FLAGS<span class="o">=</span><span class="s2">&#34;-Xclang -fopenmp&#34;</span> <span class="se">\
</span></span></span><span class="line"><span class="cl">    -DOpenMP_C_LIB_NAMES<span class="o">=</span><span class="s2">&#34;libomp&#34;</span> -DOpenMP_CXX_LIB_NAMES<span class="o">=</span><span class="s2">&#34;libomp&#34;</span> <span class="se">\
</span></span></span><span class="line"><span class="cl">    -DOpenMP_libomp_LIBRARY<span class="o">=</span><span class="s2">&#34;/Applications/Xcode.app/Contents/Developer/Platforms/iPhoneSimulator.platform/Developer/SDKs/iPhoneSimulator.sdk/usr/lib/libomp.a&#34;</span> <span class="se">\
</span></span></span><span class="line"><span class="cl">    -DNCNN_BUILD_BENCHMARK<span class="o">=</span>OFF ..
</span></span><span class="line"><span class="cl">
</span></span><span class="line"><span class="cl">cmake --build . -j <span class="m">4</span>
</span></span><span class="line"><span class="cl">cmake --build . --target install
</span></span></code></pre></div><p>Package glslang framework:</p>
<div class="highlight"><pre tabindex="0" class="chroma"><code class="language-shell" data-lang="shell"><span class="line"><span class="cl"><span class="nb">cd</span> &lt;ncnn-root-dir&gt;
</span></span><span class="line"><span class="cl">
</span></span><span class="line"><span class="cl">mkdir -p glslang.framework/Versions/A/Headers
</span></span><span class="line"><span class="cl">mkdir -p glslang.framework/Versions/A/Resources
</span></span><span class="line"><span class="cl">ln -s A glslang.framework/Versions/Current
</span></span><span class="line"><span class="cl">ln -s Versions/Current/Headers glslang.framework/Headers
</span></span><span class="line"><span class="cl">ln -s Versions/Current/Resources glslang.framework/Resources
</span></span><span class="line"><span class="cl">ln -s Versions/Current/glslang glslang.framework/glslang
</span></span><span class="line"><span class="cl">libtool -static build-ios/install/lib/libglslang.a build-ios/install/lib/libSPIRV.a build-ios/install/lib/libOGLCompiler.a build-ios/install/lib/libOSDependent.a -o build-ios/install/lib/libglslang_combined.a
</span></span><span class="line"><span class="cl">libtool -static build-ios-sim/install/lib/libglslang.a build-ios-sim/install/lib/libSPIRV.a build-ios-sim/install/lib/libOGLCompiler.a build-ios-sim/install/lib/libOSDependent.a -o build-ios-sim/install/lib/libglslang_combined.a
</span></span><span class="line"><span class="cl">lipo -create build-ios/install/lib/libglslang_combined.a build-ios-sim/install/lib/libglslang_combined.a -o glslang.framework/Versions/A/glslang
</span></span><span class="line"><span class="cl">cp -r build/install/include/glslang glslang.framework/Versions/A/Headers/
</span></span><span class="line"><span class="cl">sed -e <span class="s1">&#39;s/__NAME__/glslang/g&#39;</span> -e <span class="s1">&#39;s/__IDENTIFIER__/org.khronos.glslang/g&#39;</span> -e <span class="s1">&#39;s/__VERSION__/1.0/g&#39;</span> Info.plist &gt; glslang.framework/Versions/A/Resources/Info.plist
</span></span></code></pre></div><p>Package ncnn framework:</p>
<div class="highlight"><pre tabindex="0" class="chroma"><code class="language-shell" data-lang="shell"><span class="line"><span class="cl"><span class="nb">cd</span> &lt;ncnn-root-dir&gt;
</span></span><span class="line"><span class="cl">
</span></span><span class="line"><span class="cl">mkdir -p ncnn.framework/Versions/A/Headers
</span></span><span class="line"><span class="cl">mkdir -p ncnn.framework/Versions/A/Resources
</span></span><span class="line"><span class="cl">ln -s A ncnn.framework/Versions/Current
</span></span><span class="line"><span class="cl">ln -s Versions/Current/Headers ncnn.framework/Headers
</span></span><span class="line"><span class="cl">ln -s Versions/Current/Resources ncnn.framework/Resources
</span></span><span class="line"><span class="cl">ln -s Versions/Current/ncnn ncnn.framework/ncnn
</span></span><span class="line"><span class="cl">lipo -create build-ios/install/lib/libncnn.a build-ios-sim/install/lib/libncnn.a -o ncnn.framework/Versions/A/ncnn
</span></span><span class="line"><span class="cl">cp -r build-ios/install/include/* ncnn.framework/Versions/A/Headers/
</span></span><span class="line"><span class="cl">sed -e <span class="s1">&#39;s/__NAME__/ncnn/g&#39;</span> -e <span class="s1">&#39;s/__IDENTIFIER__/com.tencent.ncnn/g&#39;</span> -e <span class="s1">&#39;s/__VERSION__/1.0/g&#39;</span> Info.plist &gt; ncnn.framework/Versions/A/Resources/Info.plist
</span></span></code></pre></div><p>Pick <code>ncnn.framework</code> <code>glslang.framework</code> and <code>openmp.framework</code> folder for app development.</p>
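<p>The <code>sed</code> templating used when packaging the three frameworks can be sanity-checked on any machine with a stand-in template (the key/value lines below are simplified placeholders, not the real Info.plist keys):</p>
<pre tabindex="0"><code># stand-in template using the same __NAME__/__IDENTIFIER__/__VERSION__ placeholders
printf 'name=__NAME__\nid=__IDENTIFIER__\nversion=__VERSION__\n' | tee Info.plist.demo
sed -e 's/__NAME__/ncnn/g' -e 's/__IDENTIFIER__/com.tencent.ncnn/g' -e 's/__VERSION__/1.0/g' Info.plist.demo | tee Info.plist.out
</code></pre>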
<hr>
<h3 id="build-for-webassembly">Build for WebAssembly</h3>
<p>Install Emscripten</p>
<div class="highlight"><pre tabindex="0" class="chroma"><code class="language-shell" data-lang="shell"><span class="line"><span class="cl">git clone https://github.com/emscripten-core/emsdk.git
</span></span><span class="line"><span class="cl"><span class="nb">cd</span> emsdk
</span></span><span class="line"><span class="cl">./emsdk install 2.0.8
</span></span><span class="line"><span class="cl">./emsdk activate 2.0.8
</span></span><span class="line"><span class="cl">
</span></span><span class="line"><span class="cl"><span class="nb">source</span> emsdk/emsdk_env.sh
</span></span></code></pre></div><p>Build without any extension for general compatibility:</p>
<div class="highlight"><pre tabindex="0" class="chroma"><code class="language-shell" data-lang="shell"><span class="line"><span class="cl">mkdir -p build
</span></span><span class="line"><span class="cl"><span class="nb">cd</span> build
</span></span><span class="line"><span class="cl">cmake -DCMAKE_TOOLCHAIN_FILE<span class="o">=</span>../emsdk/upstream/emscripten/cmake/Modules/Platform/Emscripten.cmake <span class="se">\
</span></span></span><span class="line"><span class="cl">    -DNCNN_THREADS<span class="o">=</span>OFF -DNCNN_OPENMP<span class="o">=</span>OFF -DNCNN_SIMPLEOMP<span class="o">=</span>OFF -DNCNN_RUNTIME_CPU<span class="o">=</span>OFF -DNCNN_SSE2<span class="o">=</span>OFF -DNCNN_AVX2<span class="o">=</span>OFF -DNCNN_AVX<span class="o">=</span>OFF <span class="se">\
</span></span></span><span class="line"><span class="cl">    -DNCNN_BUILD_TOOLS<span class="o">=</span>OFF -DNCNN_BUILD_EXAMPLES<span class="o">=</span>OFF -DNCNN_BUILD_BENCHMARK<span class="o">=</span>OFF ..
</span></span><span class="line"><span class="cl">cmake --build . -j <span class="m">4</span>
</span></span><span class="line"><span class="cl">cmake --build . --target install
</span></span></code></pre></div><p>Build with WASM SIMD extension:</p>
<div class="highlight"><pre tabindex="0" class="chroma"><code class="language-shell" data-lang="shell"><span class="line"><span class="cl">mkdir -p build-simd
</span></span><span class="line"><span class="cl"><span class="nb">cd</span> build-simd
</span></span><span class="line"><span class="cl">cmake -DCMAKE_TOOLCHAIN_FILE<span class="o">=</span>../emsdk/upstream/emscripten/cmake/Modules/Platform/Emscripten.cmake <span class="se">\
</span></span></span><span class="line"><span class="cl">    -DNCNN_THREADS<span class="o">=</span>OFF -DNCNN_OPENMP<span class="o">=</span>OFF -DNCNN_SIMPLEOMP<span class="o">=</span>OFF -DNCNN_RUNTIME_CPU<span class="o">=</span>OFF -DNCNN_SSE2<span class="o">=</span>ON -DNCNN_AVX2<span class="o">=</span>OFF -DNCNN_AVX<span class="o">=</span>OFF <span class="se">\
</span></span></span><span class="line"><span class="cl">    -DNCNN_BUILD_TOOLS<span class="o">=</span>OFF -DNCNN_BUILD_EXAMPLES<span class="o">=</span>OFF -DNCNN_BUILD_BENCHMARK<span class="o">=</span>OFF ..
</span></span><span class="line"><span class="cl">cmake --build . -j <span class="m">4</span>
</span></span><span class="line"><span class="cl">cmake --build . --target install
</span></span></code></pre></div><p>Build with WASM Thread extension:</p>
<div class="highlight"><pre tabindex="0" class="chroma"><code class="language-shell" data-lang="shell"><span class="line"><span class="cl">mkdir -p build-threads
</span></span><span class="line"><span class="cl"><span class="nb">cd</span> build-threads
</span></span><span class="line"><span class="cl">cmake -DCMAKE_TOOLCHAIN_FILE<span class="o">=</span>../emsdk/upstream/emscripten/cmake/Modules/Platform/Emscripten.cmake <span class="se">\
</span></span></span><span class="line"><span class="cl">    -DNCNN_THREADS<span class="o">=</span>ON -DNCNN_OPENMP<span class="o">=</span>ON -DNCNN_SIMPLEOMP<span class="o">=</span>ON -DNCNN_RUNTIME_CPU<span class="o">=</span>OFF -DNCNN_SSE2<span class="o">=</span>OFF -DNCNN_AVX2<span class="o">=</span>OFF -DNCNN_AVX<span class="o">=</span>OFF <span class="se">\
</span></span></span><span class="line"><span class="cl">    -DNCNN_BUILD_TOOLS<span class="o">=</span>OFF -DNCNN_BUILD_EXAMPLES<span class="o">=</span>OFF -DNCNN_BUILD_BENCHMARK<span class="o">=</span>OFF ..
</span></span><span class="line"><span class="cl">cmake --build . -j <span class="m">4</span>
</span></span><span class="line"><span class="cl">cmake --build . --target install
</span></span></code></pre></div><p>Build with WASM SIMD and Thread extension:</p>
<div class="highlight"><pre tabindex="0" class="chroma"><code class="language-shell" data-lang="shell"><span class="line"><span class="cl">mkdir -p build-simd-threads
</span></span><span class="line"><span class="cl"><span class="nb">cd</span> build-simd-threads
</span></span><span class="line"><span class="cl">cmake -DCMAKE_TOOLCHAIN_FILE<span class="o">=</span>../emsdk/upstream/emscripten/cmake/Modules/Platform/Emscripten.cmake <span class="se">\
</span></span></span><span class="line"><span class="cl">    -DNCNN_THREADS<span class="o">=</span>ON -DNCNN_OPENMP<span class="o">=</span>ON -DNCNN_SIMPLEOMP<span class="o">=</span>ON -DNCNN_RUNTIME_CPU<span class="o">=</span>OFF -DNCNN_SSE2<span class="o">=</span>ON -DNCNN_AVX2<span class="o">=</span>OFF -DNCNN_AVX<span class="o">=</span>OFF <span class="se">\
</span></span></span><span class="line"><span class="cl">    -DNCNN_BUILD_TOOLS<span class="o">=</span>OFF -DNCNN_BUILD_EXAMPLES<span class="o">=</span>OFF -DNCNN_BUILD_BENCHMARK<span class="o">=</span>OFF ..
</span></span><span class="line"><span class="cl">cmake --build . -j <span class="m">4</span>
</span></span><span class="line"><span class="cl">cmake --build . --target install
</span></span></code></pre></div><p>Pick <code>build-XYZ/install</code> folder for further usage.</p>
<hr>
<h3 id="build-for-allwinner-d1">Build for AllWinner D1</h3>
<p>Download c906 toolchain package from <a href="https://occ.t-head.cn/community/download?id=3913221581316624384">https://occ.t-head.cn/community/download?id=3913221581316624384</a></p>
<div class="highlight"><pre tabindex="0" class="chroma"><code class="language-shell" data-lang="shell"><span class="line"><span class="cl">tar -xf riscv64-linux-x86_64-20210512.tar.gz
</span></span><span class="line"><span class="cl"><span class="nb">export</span> <span class="nv">RISCV_ROOT_PATH</span><span class="o">=</span>/home/nihui/osd/riscv64-linux-x86_64-20210512
</span></span></code></pre></div><p>Build ncnn with riscv-v vector and simpleocv enabled:</p>
<div class="highlight"><pre tabindex="0" class="chroma"><code class="language-shell" data-lang="shell"><span class="line"><span class="cl">mkdir -p build-c906
</span></span><span class="line"><span class="cl"><span class="nb">cd</span> build-c906
</span></span><span class="line"><span class="cl">cmake -DCMAKE_TOOLCHAIN_FILE<span class="o">=</span>../toolchains/c906.toolchain.cmake <span class="se">\
</span></span></span><span class="line"><span class="cl">    -DCMAKE_BUILD_TYPE<span class="o">=</span>relwithdebinfo -DNCNN_OPENMP<span class="o">=</span>OFF -DNCNN_THREADS<span class="o">=</span>OFF -DNCNN_RUNTIME_CPU<span class="o">=</span>OFF -DNCNN_RVV<span class="o">=</span>ON <span class="se">\
</span></span></span><span class="line"><span class="cl">    -DNCNN_SIMPLEOCV<span class="o">=</span>ON -DNCNN_BUILD_EXAMPLES<span class="o">=</span>ON ..
</span></span><span class="line"><span class="cl">cmake --build . -j <span class="m">4</span>
</span></span><span class="line"><span class="cl">cmake --build . --target install
</span></span></code></pre></div><p>Pick <code>build-c906/install</code> folder for further usage.</p>
<p>You can upload the binaries inside the <code>build-c906/examples</code> folder and run them on the D1 board for testing.</p>
<hr>
<h3 id="build-for-loongson-2k1000">Build for Loongson 2K1000</h3>
<p>For gcc versions earlier than 8.5, you need to patch the msa.h header to work around an msa fmadd bug.</p>
<p>Open <code>/usr/lib/gcc/mips64el-linux-gnuabi64/8/include/msa.h</code>, find <code>__msa_fmadd_w</code>, and apply the following change:</p>
<div class="highlight"><pre tabindex="0" class="chroma"><code class="language-c" data-lang="c"><span class="line"><span class="cl"><span class="c1">// #define __msa_fmadd_w __builtin_msa_fmadd_w
</span></span></span><span class="line"><span class="cl"><span class="cp">#define __msa_fmadd_w(a, b, c) __builtin_msa_fmadd_w(c, b, a)
</span></span></span></code></pre></div><p>Build ncnn with mips msa and simpleocv enabled:</p>
<div class="highlight"><pre tabindex="0" class="chroma"><code class="language-shell" data-lang="shell"><span class="line"><span class="cl">mkdir -p build
</span></span><span class="line"><span class="cl"><span class="nb">cd</span> build
</span></span><span class="line"><span class="cl">cmake -DNCNN_DISABLE_RTTI<span class="o">=</span>ON -DNCNN_DISABLE_EXCEPTION<span class="o">=</span>ON -DNCNN_RUNTIME_CPU<span class="o">=</span>OFF -DNCNN_MSA<span class="o">=</span>ON -DNCNN_MMI<span class="o">=</span>ON -DNCNN_SIMPLEOCV<span class="o">=</span>ON ..
</span></span><span class="line"><span class="cl">cmake --build . -j <span class="m">2</span>
</span></span><span class="line"><span class="cl">cmake --build . --target install
</span></span></code></pre></div><p>Pick the <code>build/install</code> folder for further usage.</p>
<p>You can run the binaries inside the <code>build/examples</code> folder for testing.</p>
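<p>For a quick performance check on the board you can also run the bundled benchmark tool; the path below assumes <code>NCNN_BUILD_BENCHMARK</code> was left at its default of ON.</p>

```shell
# benchncnn prints per-model inference times.
# Arguments: loop count, thread count, powersave mode (0 = use all cores).
BENCH=build/benchmark/benchncnn
if [ -x "$BENCH" ]; then
    "$BENCH" 8 2 0
else
    echo "benchncnn not built -- re-run cmake with -DNCNN_BUILD_BENCHMARK=ON"
fi
```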
<hr>
<h3 id="build-for-termux-on-android">Build for Termux on Android</h3>
<p>Install the Termux app on your phone, then install Ubuntu inside Termux.</p>
<p>If you want to use ssh, just install openssh in Termux:</p>
<pre tabindex="0"><code>pkg install proot-distro
proot-distro install ubuntu
</code></pre><p>You can also list the systems available for installation with <code>proot-distro list</code>.</p>
<p>Once Ubuntu is installed successfully, log in with <code>proot-distro login ubuntu</code>.</p>
<p>Then build ncnn; there is no need to install any other dependencies.</p>
<pre tabindex="0"><code>git clone https://github.com/Tencent/ncnn.git
cd ncnn
git submodule update --init
mkdir -p build
cd build
cmake -DCMAKE_BUILD_TYPE=Release -DNCNN_BUILD_EXAMPLES=ON -DNCNN_PLATFORM_API=OFF -DNCNN_SIMPLEOCV=ON ..
make -j$(nproc)
</code></pre><p>Then you can run a test:</p>
<blockquote>
<p>On my Pixel 3 XL with a Qualcomm Snapdragon 845, <code>256-ncnn.png</code> could not be loaded, hence the smaller <code>128-ncnn.png</code> below.</p>
</blockquote>
<pre tabindex="0"><code>cd ../examples
../build/examples/squeezenet ../images/128-ncnn.png
</code></pre>]]></content:encoded>
    </item>
    <item>
      <title>Debugging Native Programs on Android 11 with LLDB</title>
      <link>https://xgdebug.com/zh/posts/tech/debug/using-lldv-to-debug-native-android-11/</link>
      <pubDate>Sat, 28 Sep 2024 04:32:12 +0000</pubDate>
      <guid>https://xgdebug.com/zh/posts/tech/debug/using-lldv-to-debug-native-android-11/</guid>
      <description>This article explains how to set up a native debugging environment with ADB and LLDB on both the phone and the PC, with complete steps from starting the debug server to attaching to a process and setting breakpoints.</description>
      <content:encoded><![CDATA[<h1 id="手机端">On the phone</h1>
<h2 id="push-调试服务器到手机">Push lldb-server to the phone</h2>
<div class="highlight"><pre tabindex="0" class="chroma"><code class="language-shell" data-lang="shell"><span class="line"><span class="cl">adb push lldb-server /data/local/tmp
</span></span><span class="line"><span class="cl">adb shell chmod <span class="m">755</span> /data/local/tmp/lldb-server
</span></span></code></pre></div><h2 id="启动调试器服务">Start the debug server</h2>
<div class="highlight"><pre tabindex="0" class="chroma"><code class="language-shell" data-lang="shell"><span class="line"><span class="cl">/data/local/tmp/lldb-server platform --listen <span class="s2">&#34;*:8888&#34;</span> --server
</span></span></code></pre></div><hr>
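<p>The phone-side setup and the port forwarding below can be combined into a single helper run from the PC. This is a sketch: the serial number and port are placeholders.</p>

```shell
# Push lldb-server, make it executable, forward the TCP port, and start the
# server in the background; serial number and port are placeholders.
setup_lldb_remote() {
    serial="$1"
    port="${2:-8888}"
    adb -s "$serial" push lldb-server /data/local/tmp/ &&
    adb -s "$serial" shell chmod 755 /data/local/tmp/lldb-server &&
    adb -s "$serial" forward "tcp:$port" "tcp:$port" &&
    adb -s "$serial" shell "/data/local/tmp/lldb-server platform --listen '*:$port' --server" &
}
# Usage: setup_lldb_remote 9643e0ec0604 8888
```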
<h1 id="电脑端">On the computer</h1>
<h2 id="端口转发">Forward the port</h2>
<div class="highlight"><pre tabindex="0" class="chroma"><code class="language-shell" data-lang="shell"><span class="line"><span class="cl">adb forward tcp:8888 tcp:8888
</span></span></code></pre></div><h2 id="启动-lldb">Start LLDB</h2>
<div class="highlight"><pre tabindex="0" class="chroma"><code class="language-shell" data-lang="shell"><span class="line"><span class="cl">.<span class="se">\l</span>ldb.exe
</span></span></code></pre></div><h2 id="查看支持平台">List the supported platforms</h2>
<div class="highlight"><pre tabindex="0" class="chroma"><code class="language-shell" data-lang="shell"><span class="line"><span class="cl">platform list
</span></span></code></pre></div><h2 id="选-android-平台">Select the Android platform</h2>
<div class="highlight"><pre tabindex="0" class="chroma"><code class="language-shell" data-lang="shell"><span class="line"><span class="cl">platform <span class="k">select</span> remote-android
</span></span></code></pre></div><h2 id="连接到手机-手机序列号-9643e0ec0604-要换成当前调试的手机使用-adb-devices-查看序列号">Connect to the phone (replace the serial number <strong>9643e0ec0604</strong> with that of the phone being debugged; check it with <code>adb devices</code>)</h2>
<div class="highlight"><pre tabindex="0" class="chroma"><code class="language-shell" data-lang="shell"><span class="line"><span class="cl">platform connect connect://9643e0ec0604:8888
</span></span></code></pre></div><h2 id="查看当前正在运行的进程">List the currently running processes</h2>
<div class="highlight"><pre tabindex="0" class="chroma"><code class="language-shell" data-lang="shell"><span class="line"><span class="cl">platform process list
</span></span></code></pre></div><h2 id="附加上去">Attach to the process</h2>
<div class="highlight"><pre tabindex="0" class="chroma"><code class="language-shell" data-lang="shell"><span class="line"><span class="cl">attach <span class="m">9053</span>
</span></span></code></pre></div><h2 id="下断点">Set a breakpoint</h2>
<div class="highlight"><pre tabindex="0" class="chroma"><code class="language-shell" data-lang="shell"><span class="line"><span class="cl">b send
</span></span></code></pre></div><h2 id="跑起来">Resume execution</h2>
<div class="highlight"><pre tabindex="0" class="chroma"><code class="language-shell" data-lang="shell"><span class="line"><span class="cl">c
</span></span></code></pre></div><h2 id="查看线程列表">List the threads</h2>
<div class="highlight"><pre tabindex="0" class="chroma"><code class="language-shell" data-lang="shell"><span class="line"><span class="cl">thread list
</span></span></code></pre></div><h2 id="查看调用栈">View the call stack</h2>
<div class="highlight"><pre tabindex="0" class="chroma"><code class="language-shell" data-lang="shell"><span class="line"><span class="cl">bt
</span></span></code></pre></div>]]></content:encoded>
    </item>
    <item>
      <title>Pitfalls Encountered Installing stable-diffusion-webui on Arch</title>
      <link>https://xgdebug.com/zh/posts/tech/linux/arch-sd-webui-pitfalls/</link>
      <pubDate>Tue, 12 Sep 2023 15:21:53 +0000</pubDate>
      <guid>https://xgdebug.com/zh/posts/tech/linux/arch-sd-webui-pitfalls/</guid>
      <description>&lt;p&gt;1. Do not use the Tsinghua mirror; use the Aliyun one instead, because the Tsinghua mirror is incomplete&lt;br&gt;
2. Use python launch.py to install some of the git repositories&lt;br&gt;
3. Install the version-pinned pip packages from requirements_versions.txt&lt;br&gt;
4. You can run python webui.py --port=7860 --server=0.0.0.0 --medvram to save VRAM&lt;/p&gt;</description>
      <content:encoded><![CDATA[<p>1. Do not use the Tsinghua mirror; use the Aliyun one instead, because the Tsinghua mirror is incomplete<br>
2. Use <code>python launch.py</code> to install some of the git repositories<br>
3. Install the version-pinned pip packages from <code>requirements_versions.txt</code><br>
4. You can run <code>python webui.py --port=7860 --server=0.0.0.0 --medvram</code> to save VRAM</p>
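<p>For point 1, the mirror can be set once in pip's user configuration (the URL below is the standard Aliyun PyPI mirror address):</p>

```shell
# Switch pip to the Aliyun PyPI mirror; does nothing if pip is not installed.
if command -v pip >/dev/null 2>&1; then
    pip config set global.index-url https://mirrors.aliyun.com/pypi/simple/
    pip config get global.index-url
else
    echo "pip not installed"
fi
```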
]]></content:encoded>
    </item>
    <item>
      <title>Deploying a PyTorch Model to an Android Phone with ncnn</title>
      <link>https://xgdebug.com/zh/posts/tech/ai/use-ncnn-deploy-pytorch-model-android/</link>
      <pubDate>Fri, 11 Aug 2023 01:18:11 +0000</pubDate>
      <guid>https://xgdebug.com/zh/posts/tech/ai/use-ncnn-deploy-pytorch-model-android/</guid>
      <description>How to deploy a PyTorch model to an Android phone with ncnn: enable Vulkan GPU support at build time, train a YOLO model and export it to ONNX, simplify and convert it to the ncnn format, and build ncnn for Linux, Windows, macOS, Android, iOS, WebAssembly, and several embedded platforms.</description>
      <content:encoded><![CDATA[<h2 id="使用-ncnn-将-pytorch-模型部署到安卓手机">Deploying a PyTorch model to an Android phone with ncnn</h2>
<ol>
<li>Enable GPU support when compiling ncnn. Vulkan is used for the GPU; set <code>-DNCNN_VULKAN=ON</code>.</li>
<li>MobileNetV3</li>
</ol>
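<p>For step 1, a Vulkan-enabled Android build can be configured roughly as follows. This is a sketch: the NDK location is taken from the <code>ANDROID_NDK</code> environment variable, and the ABI and platform level are assumptions.</p>

```shell
# Configure an arm64 Android build of ncnn with Vulkan (GPU) support enabled.
configure_ncnn_android() {
    mkdir -p build-android-aarch64 && cd build-android-aarch64 &&
    cmake -DCMAKE_TOOLCHAIN_FILE="$ANDROID_NDK/build/cmake/android.toolchain.cmake" \
          -DANDROID_ABI="arm64-v8a" -DANDROID_PLATFORM=android-24 \
          -DNCNN_VULKAN=ON ..
}
```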
<h2 id="在编译到-mt-时开启-cmake-0091-特性">Enable the CMake CMP0091 policy when building with the static (MT) runtime</h2>
<div class="highlight"><pre tabindex="0" class="chroma"><code class="language-bash" data-lang="bash"><span class="line"><span class="cl">cmake_minimum_required<span class="o">(</span>VERSION 3.20.0<span class="o">)</span>
</span></span><span class="line"><span class="cl">cmake_policy<span class="o">(</span>SET CMP0091 NEW<span class="o">)</span>
</span></span><span class="line"><span class="cl">set<span class="o">(</span>CMAKE_MSVC_RUNTIME_LIBRARY <span class="s2">&#34;MultiThreaded</span>$<span class="s2">&lt;</span>$<span class="s2">&lt;CONFIG:Debug&gt;:Debug&gt;&#34;</span><span class="o">)</span>
</span></span><span class="line"><span class="cl">project<span class="o">(</span><span class="s2">&#34;client-project&#34;</span><span class="o">)</span>
</span></span></code></pre></div><h3 id="训练-yolo">Train YOLO</h3>
<div class="highlight"><pre tabindex="0" class="chroma"><code class="language-bash" data-lang="bash"><span class="line"><span class="cl"><span class="se">\E</span>nvs<span class="se">\t</span>orch<span class="se">\S</span>cripts<span class="se">\a</span>ctivate.ps1
</span></span><span class="line"><span class="cl">python train.py --batch <span class="m">6</span> --workers <span class="m">2</span> --imgsz <span class="m">960</span> --epochs <span class="m">300</span> --data <span class="s2">&#34;\Core\yaml\data.yaml&#34;</span> --cfg <span class="s2">&#34;\Core\yaml\cfg.yaml&#34;</span> --weights <span class="se">\C</span>ore<span class="se">\w</span>eights<span class="se">\b</span>est.pt --device <span class="m">0</span>
</span></span></code></pre></div><h4 id="模型转换">Convert the model</h4>
<div class="highlight"><pre tabindex="0" class="chroma"><code class="language-python" data-lang="python">from torch import nn
import torch.utils.model_zoo as model_zoo
import torch.onnx
from libs import define
from libs.net import Net
from libs.dataset import ImageDataset
import os

test_data = ImageDataset(define.testPath, False)
test_loader = torch.utils.data.DataLoader(test_data, batch_size=1, shuffle=True)

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
model = Net(out_dim=19).to(device)
model.load_state_dict(torch.load("./widget/last.pt"))
model.eval()

def saveOnnx():
    for data, target in test_loader:
        data, target = data.to(device), target.to(device)
        label = target.long()
        y = model(data)
        # Export the model
        torch.onnx.export(model,                    # the model being run
                          data,                     # model input (or a tuple of inputs)
                          "./widget/best.onnx",     # where to save the model (file or file-like object)
                          export_params=True,       # store the trained parameter weights in the model file
                          opset_version=10,         # the ONNX opset version to export with
                          do_constant_folding=True, # whether to apply constant folding for optimization
                          input_names=['input'],    # the model's input names
                          output_names=['output'],  # the model's output names
                          dynamic_axes={'input': {0: 'batch_size'},    # variable-length axes
                                        'output': {0: 'batch_size'}})

        traced_script_module = torch.jit.trace(model, data)
        return

saveOnnx()
# Conversion: simplify the ONNX model, convert it to ncnn, then optimize it
os.system("python -m onnxsim ./widget/best.onnx ./widget/best-sim.onnx")
os.system("./bin/onnx2ncnn.exe ./widget/best-sim.onnx ./widget/best.param ./widget/best.bin")
os.system("./bin/ncnnoptimize.exe ./widget/best.param ./widget/best.bin ./widget/best-opt.param ./widget/best-opt.bin 65536")
</code></pre></div><p>The YOLO model from the previous section is exported and converted from the command line instead:</p>
<div class="highlight"><pre tabindex="0" class="chroma"><code class="language-shell" data-lang="shell">python .\export.py --weights weights/best.pt --img 960 --batch 1 --train
python -m onnxsim best.onnx best-sim.onnx
.\onnx2ncnn.exe best-sim.onnx best.param best.bin
ncnnoptimize best.param best.bin best-opt.param best-opt.bin 65536
</code></pre></div><h3 id="git-克隆-ncnn-仓库及子模块">Clone the ncnn repository and its submodules</h3>
<pre tabindex="0"><code>$ git clone https://github.com/Tencent/ncnn.git
$ cd ncnn
$ git submodule update --init
</code></pre><ul>
<li><a href="/zh/posts/tech/ai/use-ncnn-deploy-pytorch-model-android/#build-for-linux">为 Linux 构建</a></li>
<li><a href="/zh/posts/tech/ai/use-ncnn-deploy-pytorch-model-android/#build-for-windows-x64-using-visual-studio-community-2017">使用 VS2017 为 Windows x64 构建</a></li>
<li><a href="/zh/posts/tech/ai/use-ncnn-deploy-pytorch-model-android/#build-for-macos">为 macOS 构建</a></li>
<li><a href="/zh/posts/tech/ai/use-ncnn-deploy-pytorch-model-android/#build-for-arm-cortex-a-family-with-cross-compiling">使用交叉编译为 ARM Cortex-A 系列构建</a></li>
<li><a href="/zh/posts/tech/ai/use-ncnn-deploy-pytorch-model-android/#build-for-hisilicon-platform-with-cross-compiling">使用交叉编译为 Hisilicon 平台构建</a></li>
<li><a href="/zh/posts/tech/ai/use-ncnn-deploy-pytorch-model-android/#build-for-android">为 Android 构建</a></li>
<li><a href="/zh/posts/tech/ai/use-ncnn-deploy-pytorch-model-android/#build-for-ios-on-macos-with-xcode">在 macOS 上使用 xcode 为 iOS 构建</a></li>
<li><a href="/zh/posts/tech/ai/use-ncnn-deploy-pytorch-model-android/#build-for-webassembly">为 WebAssembly 构建</a></li>
<li><a href="/zh/posts/tech/ai/use-ncnn-deploy-pytorch-model-android/#build-for-allwinner-d1">为 AllWinner D1 构建</a></li>
<li><a href="/zh/posts/tech/ai/use-ncnn-deploy-pytorch-model-android/#build-for-loongson-2k1000">为 Loongson 2K1000 构建</a></li>
<li><a href="/zh/posts/tech/ai/use-ncnn-deploy-pytorch-model-android/#build-for-termux-on-android">为 Android 上的 Termux 构建</a></li>
</ul>
<hr>
<h3 id="为-linux-构建">为 Linux 构建</h3>
<p>安装所需的构建依赖项：</p>
<ul>
<li>git</li>
<li>g++</li>
<li>cmake</li>
<li>protocol buffer (protobuf) 头文件和 protobuf 编译器</li>
<li>vulkan 头文件和加载器库</li>
<li>glslang</li>
<li>(可选) opencv # 用于构建示例</li>
</ul>
<p>一般来说，只要你的 Intel、AMD 或 Nvidia GPU 是近 10 年内发布的，Vulkan 都可以直接使用。</p>
<p>在某些系统上，目前（2020 年 10 月）可能还没有易于获得的 Vulkan 驱动，此时你可能需要禁用 Vulkan。Raspberry Pi 3 就属于这种情况（一个实验性的开源 Vulkan 驱动正在开发中，但尚未准备就绪）。Nvidia Tegra 系列设备（如 Nvidia Jetson）应该支持 Vulkan。请确保安装最新的软件以获得最佳体验。</p>
<p>在 Debian、Ubuntu 或 Raspberry Pi OS 上，你可以使用以下命令安装所有必需的依赖项：</p>
<div class="highlight"><pre tabindex="0" class="chroma"><code class="language-bash" data-lang="bash"><span class="line"><span class="cl">sudo apt install build-essential git cmake libprotobuf-dev protobuf-compiler libvulkan-dev vulkan-utils libopencv-dev
</span></span></code></pre></div><p>要使用 Vulkan 后端，你需要安装 Vulkan 头文件、一个 Vulkan 驱动加载器、GLSL 到 SPIR-V 编译器和 <code>vulkaninfo</code> 工具。最好从你的发行版仓库中获取。或者，从 <a href="https://vulkan.lunarg.com/sdk/home">https://vulkan.lunarg.com/sdk/home</a> 下载并安装完整的 Vulkan SDK（大小约为 200MB；它包含所有头文件、文档和预构建加载器，以及一些额外的工具和所有内容的源代码）。</p>
<div class="highlight"><pre tabindex="0" class="chroma"><code class="language-bash" data-lang="bash"><span class="line"><span class="cl">wget https://sdk.lunarg.com/sdk/download/1.2.189.0/linux/vulkansdk-linux-x86_64-1.2.189.0.tar.gz?Human<span class="o">=</span><span class="nb">true</span> -O vulkansdk-linux-x86_64-1.2.189.0.tar.gz
</span></span><span class="line"><span class="cl">tar -xf vulkansdk-linux-x86_64-1.2.189.0.tar.gz
</span></span><span class="line"><span class="cl"><span class="nb">export</span> <span class="nv">VULKAN_SDK</span><span class="o">=</span><span class="k">$(</span><span class="nb">pwd</span><span class="k">)</span>/1.2.189.0/x86_64
</span></span></code></pre></div><p>在稍后构建 ncnn 时使用 Vulkan，你还需要为你的 GPU 拥有 Vulkan 驱动。对于 AMD 和 Intel GPU，这些可以在 Mesa 图形驱动中找到，通常在所有发行版中默认安装（例如，在 Debian/Ubuntu 上使用 <code>sudo apt install mesa-vulkan-drivers</code>）。对于 Nvidia GPU，必须下载并安装专有 Nvidia 驱动（某些发行版会以更简单的方式允许安装）。安装 Vulkan 驱动后，使用 <code>vulkaninfo</code> 或 <code>vulkaninfo | grep deviceType</code> 确认 Vulkan 库和驱动是否正常工作，它应该列出 GPU 设备类型。如果安装了多个 GPU（包括集成 GPU 和独立 GPU 的情况，常见于笔记本电脑），你可能需要记下设备的顺序以便稍后使用。</p>
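<p>上述检查可以合并成一段小脚本（示意：假设系统中可能尚未安装 <code>vulkaninfo</code>，脚本会相应提示）：</p>

```shell
# 列出 Vulkan 物理设备类型；多 GPU 时输出行的先后顺序即设备索引 0、1、……
# 假设：vulkaninfo 可能尚未安装，此时给出提示而不是报错
if command -v vulkaninfo >/dev/null 2>&1; then
  vulkaninfo 2>/dev/null | grep -i deviceType
else
  echo "未找到 vulkaninfo，请先安装 Vulkan 驱动与工具"
fi
```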
<p>Nvidia Jetson 设备中的 Vulkan 支持应包含在 Nvidia 提供的 SDK (Jetpack) 或预构建的操作系统镜像中。</p>
<p>Raspberry Pi 的 Vulkan 驱动是存在的，但尚未成熟。你可以根据自己的判断进行实验，并报告结果和性能。</p>
<div class="highlight"><pre tabindex="0" class="chroma"><code class="language-bash" data-lang="bash"><span class="line"><span class="cl"><span class="nb">cd</span> ncnn
</span></span><span class="line"><span class="cl">mkdir -p build
</span></span><span class="line"><span class="cl"><span class="nb">cd</span> build
</span></span><span class="line"><span class="cl">cmake -DCMAKE_BUILD_TYPE<span class="o">=</span>Release -DNCNN_VULKAN<span class="o">=</span>ON -DNCNN_SYSTEM_GLSLANG<span class="o">=</span>ON -DNCNN_BUILD_EXAMPLES<span class="o">=</span>ON ..
</span></span><span class="line"><span class="cl">make -j<span class="k">$(</span>nproc<span class="k">)</span>
</span></span></code></pre></div><p>你可以向上面的 <code>cmake</code> 添加 <code>-GNinja</code> 来使用 Ninja 构建系统（使用 <code>ninja</code> 或 <code>cmake --build .</code> 调用构建）。</p>
<p>对于 Nvidia Jetson 设备，向 <code>cmake</code> 添加 <code>-DCMAKE_TOOLCHAIN_FILE=../toolchains/jetson.toolchain.cmake</code>。</p>
<p>对于 Raspberry Pi 3，向 <code>cmake</code> 添加 <code>-DCMAKE_TOOLCHAIN_FILE=../toolchains/pi3.toolchain.cmake -DPI3=ON</code>。你也可以考虑禁用 Vulkan 支持，因为 Raspberry Pi 的 Vulkan 驱动仍不成熟，但启用支持而不使用它也没有坏处。</p>
<p>通过运行一些示例来验证构建：</p>
<div class="highlight"><pre tabindex="0" class="chroma"><code class="language-bash" data-lang="bash"><span class="line"><span class="cl"><span class="nb">cd</span> ../examples
</span></span><span class="line"><span class="cl">../build/examples/squeezenet ../images/256-ncnn.png
</span></span><span class="line"><span class="cl"><span class="o">[</span><span class="m">0</span> AMD RADV FIJI <span class="o">(</span>LLVM 10.0.1<span class="o">)]</span> <span class="nv">queueC</span><span class="o">=</span>1<span class="o">[</span>4<span class="o">]</span> <span class="nv">queueG</span><span class="o">=</span>0<span class="o">[</span>1<span class="o">]</span> <span class="nv">queueT</span><span class="o">=</span>0<span class="o">[</span>1<span class="o">]</span>
</span></span><span class="line"><span class="cl"><span class="o">[</span><span class="m">0</span> AMD RADV FIJI <span class="o">(</span>LLVM 10.0.1<span class="o">)]</span> <span class="nv">bugsbn1</span><span class="o">=</span><span class="m">0</span> <span class="nv">buglbia</span><span class="o">=</span><span class="m">0</span> <span class="nv">bugcopc</span><span class="o">=</span><span class="m">0</span> <span class="nv">bugihfa</span><span class="o">=</span><span class="m">0</span>
</span></span><span class="line"><span class="cl"><span class="o">[</span><span class="m">0</span> AMD RADV FIJI <span class="o">(</span>LLVM 10.0.1<span class="o">)]</span> <span class="nv">fp16p</span><span class="o">=</span><span class="m">1</span> <span class="nv">fp16s</span><span class="o">=</span><span class="m">1</span> <span class="nv">fp16a</span><span class="o">=</span><span class="m">0</span> <span class="nv">int8s</span><span class="o">=</span><span class="m">1</span> <span class="nv">int8a</span><span class="o">=</span><span class="m">1</span>
</span></span><span class="line"><span class="cl"><span class="nv">532</span> <span class="o">=</span> 0.163452
</span></span><span class="line"><span class="cl"><span class="nv">920</span> <span class="o">=</span> 0.093140
</span></span><span class="line"><span class="cl"><span class="nv">716</span> <span class="o">=</span> 0.061584
</span></span></code></pre></div><p>你也可以运行基准测试（第 4 个参数是使用的 GPU 设备索引，参考 <code>vulkaninfo</code>，如果你有多个 GPU）：</p>
<div class="highlight"><pre tabindex="0" class="chroma"><code class="language-bash" data-lang="bash"><span class="line"><span class="cl"><span class="nb">cd</span> ../benchmark
</span></span><span class="line"><span class="cl">../build/benchmark/benchncnn <span class="m">10</span> <span class="k">$(</span>nproc<span class="k">)</span> <span class="m">0</span> <span class="m">0</span>
</span></span><span class="line"><span class="cl"><span class="o">[</span><span class="m">0</span> AMD RADV FIJI <span class="o">(</span>LLVM 10.0.1<span class="o">)]</span> <span class="nv">queueC</span><span class="o">=</span>1<span class="o">[</span>4<span class="o">]</span> <span class="nv">queueG</span><span class="o">=</span>0<span class="o">[</span>1<span class="o">]</span> <span class="nv">queueT</span><span class="o">=</span>0<span class="o">[</span>1<span class="o">]</span>
</span></span><span class="line"><span class="cl"><span class="o">[</span><span class="m">0</span> AMD RADV FIJI <span class="o">(</span>LLVM 10.0.1<span class="o">)]</span> <span class="nv">bugsbn1</span><span class="o">=</span><span class="m">0</span> <span class="nv">buglbia</span><span class="o">=</span><span class="m">0</span> <span class="nv">bugcopc</span><span class="o">=</span><span class="m">0</span> <span class="nv">bugihfa</span><span class="o">=</span><span class="m">0</span>
</span></span><span class="line"><span class="cl"><span class="o">[</span><span class="m">0</span> AMD RADV FIJI <span class="o">(</span>LLVM 10.0.1<span class="o">)]</span> <span class="nv">fp16p</span><span class="o">=</span><span class="m">1</span> <span class="nv">fp16s</span><span class="o">=</span><span class="m">1</span> <span class="nv">fp16a</span><span class="o">=</span><span class="m">0</span> <span class="nv">int8s</span><span class="o">=</span><span class="m">1</span> <span class="nv">int8a</span><span class="o">=</span><span class="m">1</span>
</span></span><span class="line"><span class="cl"><span class="nv">num_threads</span> <span class="o">=</span> <span class="m">4</span>
</span></span><span class="line"><span class="cl"><span class="nv">powersave</span> <span class="o">=</span> <span class="m">0</span>
</span></span><span class="line"><span class="cl"><span class="nv">gpu_device</span> <span class="o">=</span> <span class="m">0</span>
</span></span><span class="line"><span class="cl"><span class="nv">cooling_down</span> <span class="o">=</span> <span class="m">1</span>
</span></span><span class="line"><span class="cl">squeezenet <span class="nv">min</span> <span class="o">=</span> 4.68 <span class="nv">max</span> <span class="o">=</span> 4.99 <span class="nv">avg</span> <span class="o">=</span> 4.85
</span></span><span class="line"><span class="cl">squeezenet_int8 <span class="nv">min</span> <span class="o">=</span> 38.52 <span class="nv">max</span> <span class="o">=</span> 66.90 <span class="nv">avg</span> <span class="o">=</span> 48.52
</span></span></code></pre></div><p>要仅在 CPU 上运行基准测试，将 GPU 设备索引参数（上例中的第 4 个参数）设置为 <code>-1</code>。</p>
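<p>两种基准方式可以放进一个带守护判断的脚本里对比（示意：假设已按上文完成构建，<code>benchncnn</code> 的相对路径为假设）：</p>

```shell
# 先用 GPU 设备 0、再仅用 CPU 运行基准（假设 benchncnn 已按上文构建）
BENCH=../build/benchmark/benchncnn
if [ -x "$BENCH" ]; then
  "$BENCH" 10 "$(nproc)" 0 0    # 第 4 个参数 0：GPU 设备索引
  "$BENCH" 10 "$(nproc)" 0 -1   # 第 4 个参数 -1：仅使用 CPU
else
  echo "未找到 benchncnn，请先按上文完成构建"
fi
```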
<hr>
<h3 id="使用-visual-studio-community-2017-为-windows-x64-构建">使用 Visual Studio Community 2017 为 Windows x64 构建</h3>
<p>从 <a href="https://visualstudio.microsoft.com/vs/community/">https://visualstudio.microsoft.com/vs/community/</a> 下载并安装 Visual Studio Community 2017。</p>
<p>启动命令提示符：<code>Start → Programs → Visual Studio 2017 → Visual Studio Tools → x64 Native Tools Command Prompt for VS 2017</code></p>
<p>从 <a href="https://github.com/google/protobuf/archive/v3.4.0.zip">https://github.com/google/protobuf/archive/v3.4.0.zip</a> 下载 protobuf-3.4.0。</p>
<p>构建 protobuf 库：</p>
<div class="highlight"><pre tabindex="0" class="chroma"><code class="language-bash" data-lang="bash"><span class="line"><span class="cl"><span class="nb">cd</span> &lt;protobuf-root-dir&gt;
</span></span><span class="line"><span class="cl">mkdir build
</span></span><span class="line"><span class="cl"><span class="nb">cd</span> build
</span></span><span class="line"><span class="cl">cmake -G<span class="s2">&#34;NMake Makefiles&#34;</span> -DCMAKE_BUILD_TYPE<span class="o">=</span>Release -DCMAKE_INSTALL_PREFIX<span class="o">=</span>%cd%/install -Dprotobuf_BUILD_TESTS<span class="o">=</span>OFF -Dprotobuf_MSVC_STATIC_RUNTIME<span class="o">=</span>OFF ../cmake
</span></span><span class="line"><span class="cl">nmake
</span></span><span class="line"><span class="cl">nmake install
</span></span></code></pre></div><p>(可选) 从 <a href="https://vulkan.lunarg.com/sdk/home">https://vulkan.lunarg.com/sdk/home</a> 下载并安装 Vulkan SDK。</p>
<p>构建 ncnn 库（将 <code>protobuf-root-dir</code> 替换为正确的路径）：</p>
<div class="highlight"><pre tabindex="0" class="chroma"><code class="language-bash" data-lang="bash"><span class="line"><span class="cl"><span class="nb">cd</span> &lt;ncnn-root-dir&gt;
</span></span><span class="line"><span class="cl">mkdir -p build
</span></span><span class="line"><span class="cl"><span class="nb">cd</span> build
</span></span><span class="line"><span class="cl">cmake -G<span class="s2">&#34;NMake Makefiles&#34;</span> -DCMAKE_BUILD_TYPE<span class="o">=</span>Release -DCMAKE_INSTALL_PREFIX<span class="o">=</span>%cd%/install -DProtobuf_INCLUDE_DIR<span class="o">=</span>&lt;protobuf-root-dir&gt;/build/install/include -DProtobuf_LIBRARIES<span class="o">=</span>&lt;protobuf-root-dir&gt;/build/install/lib/libprotobuf.lib -DProtobuf_PROTOC_EXECUTABLE<span class="o">=</span>&lt;protobuf-root-dir&gt;/build/install/bin/protoc.exe -DNCNN_VULKAN<span class="o">=</span>ON ..
</span></span><span class="line"><span class="cl">nmake
</span></span><span class="line"><span class="cl">nmake install
</span></span></code></pre></div><p>注意：为了加快多核机器上的编译速度，推荐在配置 <code>cmake</code> 时通过 <code>-G</code> 标志改用 <code>jom</code> 或 <code>ninja</code>。</p>
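<p>下面是一个改用 Ninja 生成器的配置示意（假设已安装 <code>ninja</code> 并处于 ncnn 源码的 <code>build</code> 目录中，否则脚本会跳过）：</p>

```shell
# 用 Ninja 生成器代替 NMake，以便多核并行编译
# 假设：ninja 已安装，且当前目录是 ncnn 源码树下的 build 目录
if command -v ninja >/dev/null 2>&1 && [ -f ../CMakeLists.txt ]; then
  cmake -G Ninja -DCMAKE_BUILD_TYPE=Release ..
  cmake --build . --parallel
else
  echo "跳过：需要安装 ninja 并在 ncnn 的 build 目录中运行"
fi
```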
<hr>
<h3 id="为-macos-构建">为 macOS 构建</h3>
<p>首先根据你的需求安装 Xcode 或 Xcode 命令行工具。</p>
<p>然后通过 homebrew 安装 <code>protobuf</code> 和 <code>libomp</code>。</p>
<div class="highlight"><pre tabindex="0" class="chroma"><code class="language-bash" data-lang="bash"><span class="line"><span class="cl">brew install protobuf libomp
</span></span></code></pre></div><p>从 <a href="https://vulkan.lunarg.com/sdk/home">https://vulkan.lunarg.com/sdk/home</a> 下载并安装 Vulkan SDK。</p>
<div class="highlight"><pre tabindex="0" class="chroma"><code class="language-bash" data-lang="bash"><span class="line"><span class="cl">wget https://sdk.lunarg.com/sdk/download/1.2.189.0/mac/vulkansdk-macos-1.2.189.0.dmg?Human<span class="o">=</span><span class="nb">true</span> -O vulkansdk-macos-1.2.189.0.dmg
</span></span><span class="line"><span class="cl">hdiutil attach vulkansdk-macos-1.2.189.0.dmg
</span></span><span class="line"><span class="cl">sudo /Volumes/vulkansdk-macos-1.2.189.0/InstallVulkan.app/Contents/MacOS/InstallVulkan --root <span class="sb">`</span><span class="nb">pwd</span><span class="sb">`</span>/vulkansdk-macos-1.2.189.0 --accept-licenses --default-answer --confirm-command install
</span></span><span class="line"><span class="cl">hdiutil detach /Volumes/vulkansdk-macos-1.2.189.0
</span></span><span class="line"><span class="cl">
</span></span><span class="line"><span class="cl"><span class="c1"># 设置环境变量</span>
</span></span><span class="line"><span class="cl">
</span></span><span class="line"><span class="cl"><span class="nb">export</span> <span class="nv">VULKAN_SDK</span><span class="o">=</span><span class="sb">`</span><span class="nb">pwd</span><span class="sb">`</span>/vulkansdk-macos-1.2.189.0/macOS
</span></span></code></pre></div><div class="highlight"><pre tabindex="0" class="chroma"><code class="language-bash" data-lang="bash"><span class="line"><span class="cl"><span class="nb">cd</span> &lt;ncnn-root-dir&gt;
</span></span><span class="line"><span class="cl">mkdir -p build
</span></span><span class="line"><span class="cl"><span class="nb">cd</span> build
</span></span><span class="line"><span class="cl">
</span></span><span class="line"><span class="cl">cmake -DCMAKE_OSX_ARCHITECTURES<span class="o">=</span><span class="s2">&#34;x86_64;arm64&#34;</span> <span class="se">\
</span></span></span><span class="line"><span class="cl"> -DVulkan_INCLUDE_DIR<span class="o">=</span><span class="sb">`</span><span class="nb">pwd</span><span class="sb">`</span>/../vulkansdk-macos-1.2.189.0/MoltenVK/include <span class="se">\
</span></span></span><span class="line"><span class="cl"> -DVulkan_LIBRARY<span class="o">=</span><span class="sb">`</span><span class="nb">pwd</span><span class="sb">`</span>/../vulkansdk-macos-1.2.189.0/MoltenVK/dylib/macOS/libMoltenVK.dylib <span class="se">\
</span></span></span><span class="line"><span class="cl"> -DNCNN_VULKAN<span class="o">=</span>ON -DNCNN_BUILD_EXAMPLES<span class="o">=</span>ON ..
</span></span><span class="line"><span class="cl">
</span></span><span class="line"><span class="cl">cmake --build . -j <span class="m">4</span>
</span></span><span class="line"><span class="cl">cmake --build . --target install
</span></span></code></pre></div><p><em>注意：如果在安装过程中遇到与 <code>libomp</code> 相关的错误，你也可以查看我们的 GitHub Actions <a href="https://github.com/Tencent/ncnn/blob/d91cccf/.github/workflows/macos-x64-gpu.yml#L50-L68">此处</a> 来安装和使用 <code>openmp</code>。</em></p>
<hr>
<h3 id="使用交叉编译为-arm-cortex-a-系列构建">使用交叉编译为 ARM Cortex-A 系列构建</h3>
<p>从 <a href="https://developer.arm.com/open-source/gnu-toolchain/gnu-a/downloads">https://developer.arm.com/open-source/gnu-toolchain/gnu-a/downloads</a> 下载 ARM 工具链。</p>
<div class="highlight"><pre tabindex="0" class="chroma"><code class="language-bash" data-lang="bash"><span class="line"><span class="cl"><span class="nb">export</span> <span class="nv">PATH</span><span class="o">=</span><span class="s2">&#34;&lt;your-toolchain-compiler-path&gt;:</span><span class="si">${</span><span class="nv">PATH</span><span class="si">}</span><span class="s2">&#34;</span>
</span></span></code></pre></div><p>或者安装发行版提供的交叉编译器（例如，在 Debian / Ubuntu 上，你可以执行 <code>sudo apt install g++-arm-linux-gnueabi g++-arm-linux-gnueabihf g++-aarch64-linux-gnu</code>）。</p>
<p>根据你的需求，构建以下一个或多个目标。</p>
<p>AArch32 软浮点目标 (arm-linux-gnueabi)</p>
<div class="highlight"><pre tabindex="0" class="chroma"><code class="language-bash" data-lang="bash"><span class="line"><span class="cl"><span class="nb">cd</span> &lt;ncnn-root-dir&gt;
</span></span><span class="line"><span class="cl">mkdir -p build-arm-linux-gnueabi
</span></span><span class="line"><span class="cl"><span class="nb">cd</span> build-arm-linux-gnueabi
</span></span><span class="line"><span class="cl">cmake -DCMAKE_TOOLCHAIN_FILE<span class="o">=</span>../toolchains/arm-linux-gnueabi.toolchain.cmake ..
</span></span><span class="line"><span class="cl">make -j<span class="k">$(</span>nproc<span class="k">)</span>
</span></span></code></pre></div><p>AArch32 硬浮点目标 (arm-linux-gnueabihf)</p>
<div class="highlight"><pre tabindex="0" class="chroma"><code class="language-bash" data-lang="bash"><span class="line"><span class="cl"><span class="nb">cd</span> &lt;ncnn-root-dir&gt;
</span></span><span class="line"><span class="cl">mkdir -p build-arm-linux-gnueabihf
</span></span><span class="line"><span class="cl"><span class="nb">cd</span> build-arm-linux-gnueabihf
</span></span><span class="line"><span class="cl">cmake -DCMAKE_TOOLCHAIN_FILE<span class="o">=</span>../toolchains/arm-linux-gnueabihf.toolchain.cmake ..
</span></span><span class="line"><span class="cl">make -j<span class="k">$(</span>nproc<span class="k">)</span>
</span></span></code></pre></div><p>AArch64 GNU/Linux 目标 (aarch64-linux-gnu)</p>
<div class="highlight"><pre tabindex="0" class="chroma"><code class="language-bash" data-lang="bash"><span class="line"><span class="cl"><span class="nb">cd</span> &lt;ncnn-root-dir&gt;
</span></span><span class="line"><span class="cl">mkdir -p build-aarch64-linux-gnu
</span></span><span class="line"><span class="cl"><span class="nb">cd</span> build-aarch64-linux-gnu
</span></span><span class="line"><span class="cl">cmake -DCMAKE_TOOLCHAIN_FILE<span class="o">=</span>../toolchains/aarch64-linux-gnu.toolchain.cmake ..
</span></span><span class="line"><span class="cl">make -j<span class="k">$(</span>nproc<span class="k">)</span>
</span></span></code></pre></div><hr>
<h3 id="使用交叉编译为-hisilicon-平台构建">使用交叉编译为 Hisilicon 平台构建</h3>
<p>下载并安装 Hisilicon SDK。工具链应位于 <code>/opt/hisi-linux/x86-arm</code>。</p>
<div class="highlight"><pre tabindex="0" class="chroma"><code class="language-bash" data-lang="bash"><span class="line"><span class="cl"><span class="nb">cd</span> &lt;ncnn-root-dir&gt;
</span></span><span class="line"><span class="cl">mkdir -p build
</span></span><span class="line"><span class="cl"><span class="nb">cd</span> build
</span></span><span class="line"><span class="cl">
</span></span><span class="line"><span class="cl"><span class="c1"># 选择一个 cmake 工具链文件，取决于你的目标平台</span>
</span></span><span class="line"><span class="cl">
</span></span><span class="line"><span class="cl">cmake -DCMAKE_TOOLCHAIN_FILE<span class="o">=</span>../toolchains/hisiv300.toolchain.cmake ..
</span></span><span class="line"><span class="cl">cmake -DCMAKE_TOOLCHAIN_FILE<span class="o">=</span>../toolchains/hisiv500.toolchain.cmake ..
</span></span><span class="line"><span class="cl">cmake -DCMAKE_TOOLCHAIN_FILE<span class="o">=</span>../toolchains/himix100.toolchain.cmake ..
</span></span><span class="line"><span class="cl">cmake -DCMAKE_TOOLCHAIN_FILE<span class="o">=</span>../toolchains/himix200.toolchain.cmake ..
</span></span><span class="line"><span class="cl">
</span></span><span class="line"><span class="cl">make -j<span class="k">$(</span>nproc<span class="k">)</span>
</span></span><span class="line"><span class="cl">make install
</span></span></code></pre></div><hr>
<h3 id="为-android-构建">为 Android 构建</h3>
<p>你可以使用来自 <a href="https://github.com/Tencent/ncnn/releases">https://github.com/Tencent/ncnn/releases</a> 的预构建 ncnn-android-lib.zip。</p>
<p>从 <a href="http://developer.android.com/ndk/downloads/index.html">http://developer.android.com/ndk/downloads/index.html</a> 下载 Android NDK 并安装它，例如：</p>
<div class="highlight"><pre tabindex="0" class="chroma"><code class="language-bash" data-lang="bash"><span class="line"><span class="cl">unzip android-ndk-r21d-linux-x86_64.zip
</span></span><span class="line"><span class="cl"><span class="nb">export</span> <span class="nv">ANDROID_NDK</span><span class="o">=</span>&lt;your-ndk-root-path&gt;
</span></span></code></pre></div><p>(可选) 移除 Android NDK 中的硬编码调试标志 <a href="https://github.com/android-ndk/ndk/issues/243">android-ndk issue</a>。</p>
<pre tabindex="0"><code># 打开 $ANDROID_NDK/build/cmake/android.toolchain.cmake

# 删除 &#34;-g&#34; 行

list(APPEND ANDROID_COMPILER_FLAGS
-g
-DANDROID
</code></pre><p>构建 armv7 库</p>
<div class="highlight"><pre tabindex="0" class="chroma"><code class="language-bash" data-lang="bash"><span class="line"><span class="cl"><span class="nb">cd</span> &lt;ncnn-root-dir&gt;
</span></span><span class="line"><span class="cl">mkdir -p build-android-armv7
</span></span><span class="line"><span class="cl"><span class="nb">cd</span> build-android-armv7
</span></span><span class="line"><span class="cl">
</span></span><span class="line"><span class="cl">cmake -DCMAKE_TOOLCHAIN_FILE<span class="o">=</span><span class="s2">&#34;</span><span class="nv">$ANDROID_NDK</span><span class="s2">/build/cmake/android.toolchain.cmake&#34;</span> <span class="se">\
</span></span></span><span class="line"><span class="cl"> -DANDROID_ABI<span class="o">=</span><span class="s2">&#34;armeabi-v7a&#34;</span> -DANDROID_ARM_NEON<span class="o">=</span>ON <span class="se">\
</span></span></span><span class="line"><span class="cl"> -DANDROID_PLATFORM<span class="o">=</span>android-14 ..
</span></span><span class="line"><span class="cl">
</span></span><span class="line"><span class="cl"><span class="c1"># 如果你想启用 Vulkan，则需要平台 API 版本 &gt;= android-24</span>
</span></span><span class="line"><span class="cl">
</span></span><span class="line"><span class="cl">cmake -DCMAKE_TOOLCHAIN_FILE<span class="o">=</span><span class="s2">&#34;</span><span class="nv">$ANDROID_NDK</span><span class="s2">/build/cmake/android.toolchain.cmake&#34;</span> <span class="se">\
</span></span></span><span class="line"><span class="cl"> -DANDROID_ABI<span class="o">=</span><span class="s2">&#34;armeabi-v7a&#34;</span> -DANDROID_ARM_NEON<span class="o">=</span>ON <span class="se">\
</span></span></span><span class="line"><span class="cl"> -DANDROID_PLATFORM<span class="o">=</span>android-24 -DNCNN_VULKAN<span class="o">=</span>ON ..
</span></span><span class="line"><span class="cl">
</span></span><span class="line"><span class="cl">make -j<span class="k">$(</span>nproc<span class="k">)</span>
</span></span><span class="line"><span class="cl">make install
</span></span></code></pre></div><p>选择 <code>build-android-armv7/install</code> 文件夹用于进一步的 JNI 使用。</p>
<p>构建 aarch64 库：</p>
<div class="highlight"><pre tabindex="0" class="chroma"><code class="language-bash" data-lang="bash"><span class="line"><span class="cl"><span class="nb">cd</span> &lt;ncnn-root-dir&gt;
</span></span><span class="line"><span class="cl">mkdir -p build-android-aarch64
</span></span><span class="line"><span class="cl"><span class="nb">cd</span> build-android-aarch64
</span></span><span class="line"><span class="cl">
</span></span><span class="line"><span class="cl">cmake -DCMAKE_TOOLCHAIN_FILE<span class="o">=</span><span class="s2">&#34;</span><span class="nv">$ANDROID_NDK</span><span class="s2">/build/cmake/android.toolchain.cmake&#34;</span><span class="se">\
</span></span></span><span class="line"><span class="cl"> -DANDROID_ABI<span class="o">=</span><span class="s2">&#34;arm64-v8a&#34;</span> <span class="se">\
</span></span></span><span class="line"><span class="cl"> -DANDROID_PLATFORM<span class="o">=</span>android-21 ..
</span></span><span class="line"><span class="cl">
</span></span><span class="line"><span class="cl"><span class="c1"># 如果你想启用 Vulkan，则需要平台 API 版本 &gt;= android-24</span>
</span></span><span class="line"><span class="cl">
</span></span><span class="line"><span class="cl">cmake -DCMAKE_TOOLCHAIN_FILE<span class="o">=</span><span class="s2">&#34;</span><span class="nv">$ANDROID_NDK</span><span class="s2">/build/cmake/android.toolchain.cmake&#34;</span> <span class="se">\
</span></span></span><span class="line"><span class="cl"> -DANDROID_ABI<span class="o">=</span><span class="s2">&#34;arm64-v8a&#34;</span> <span class="se">\
</span></span></span><span class="line"><span class="cl"> -DANDROID_PLATFORM<span class="o">=</span>android-24 -DNCNN_VULKAN<span class="o">=</span>ON ..
</span></span><span class="line"><span class="cl">
</span></span><span class="line"><span class="cl">make -j<span class="k">$(</span>nproc<span class="k">)</span>
</span></span><span class="line"><span class="cl">make install
</span></span></code></pre></div><p>选择 <code>build-android-aarch64/install</code> 文件夹用于进一步的 JNI 使用。</p>
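<p>在 JNI 工程中使用这些 <code>install</code> 目录时，可以在 <code>CMakeLists.txt</code> 中通过 <code>find_package</code> 引入预构建的 ncnn。下面是一个示意片段，其中目录布局、工程名 <code>myjni</code> 和源文件 <code>native-lib.cpp</code> 均为假设：</p>

```cmake
# 示意：假设已把 build-android-<abi>/install 复制为工程内的 ncnn/${ANDROID_ABI}
cmake_minimum_required(VERSION 3.10)
project(myjni)

# ncnn 的 install 目录内含 lib/cmake/ncnn/ncnnConfig.cmake
set(ncnn_DIR ${CMAKE_SOURCE_DIR}/ncnn/${ANDROID_ABI}/lib/cmake/ncnn)
find_package(ncnn REQUIRED)

add_library(myjni SHARED native-lib.cpp)
target_link_libraries(myjni ncnn)
```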
<hr>
<h3 id="在-macos-上使用-xcode-为-ios-构建">在 macOS 上使用 Xcode 为 iOS 构建</h3>
<p>你可以使用来自 <a href="https://github.com/Tencent/ncnn/releases">https://github.com/Tencent/ncnn/releases</a> 的预构建 ncnn.framework、glslang.framework 和 openmp.framework。</p>
<p>安装 Xcode。</p>
<p>如果你想构建启用 bitcode 的库，可以将以下 cmake 参数中的 <code>-DENABLE_BITCODE=0</code> 替换为 <code>-DENABLE_BITCODE=1</code>。</p>
<p>下载并安装 openmp 以启用 iPhoneOS 上的多线程推理功能</p>
<div class="highlight"><pre tabindex="0" class="chroma"><code class="language-bash" data-lang="bash"><span class="line"><span class="cl">wget https://github.com/llvm/llvm-project/releases/download/llvmorg-11.0.0/openmp-11.0.0.src.tar.xz
</span></span><span class="line"><span class="cl">tar -xf openmp-11.0.0.src.tar.xz
</span></span><span class="line"><span class="cl"><span class="nb">cd</span> openmp-11.0.0.src
</span></span><span class="line"><span class="cl">
</span></span><span class="line"><span class="cl"><span class="c1"># 应用一些编译修复</span>
</span></span><span class="line"><span class="cl">
</span></span><span class="line"><span class="cl">sed -i<span class="s1">&#39;&#39;</span> -e <span class="s1">&#39;/.size __kmp_unnamed_critical_addr/d&#39;</span> runtime/src/z_Linux_asm.S
</span></span><span class="line"><span class="cl">sed -i<span class="s1">&#39;&#39;</span> -e <span class="s1">&#39;s/__kmp_unnamed_critical_addr/___kmp_unnamed_critical_addr/g&#39;</span> runtime/src/z_Linux_asm.S
</span></span><span class="line"><span class="cl">
</span></span><span class="line"><span class="cl">mkdir -p build-ios
</span></span><span class="line"><span class="cl"><span class="nb">cd</span> build-ios
</span></span><span class="line"><span class="cl">
</span></span><span class="line"><span class="cl">cmake -DCMAKE_TOOLCHAIN_FILE<span class="o">=</span>../toolchains/ios.toolchain.cmake -DCMAKE_BUILD_TYPE<span class="o">=</span>Release -DCMAKE_INSTALL_PREFIX<span class="o">=</span>install <span class="se">\
</span></span></span><span class="line"><span class="cl"> -DIOS_PLATFORM<span class="o">=</span>OS -DENABLE_BITCODE<span class="o">=</span><span class="m">0</span> -DENABLE_ARC<span class="o">=</span><span class="m">0</span> -DENABLE_VISIBILITY<span class="o">=</span><span class="m">0</span> -DIOS_ARCH<span class="o">=</span><span class="s2">&#34;armv7;arm64;arm64e&#34;</span> <span class="se">\
</span></span></span><span class="line"><span class="cl"> -DPERL_EXECUTABLE<span class="o">=</span>/usr/local/bin/perl <span class="se">\
</span></span></span><span class="line"><span class="cl"> -DLIBOMP_ENABLE_SHARED<span class="o">=</span>OFF -DLIBOMP_OMPT_SUPPORT<span class="o">=</span>OFF -DLIBOMP_USE_HWLOC<span class="o">=</span>OFF ..
</span></span><span class="line"><span class="cl">
</span></span><span class="line"><span class="cl">cmake --build . -j <span class="m">4</span>
</span></span><span class="line"><span class="cl">cmake --build . --target install
</span></span><span class="line"><span class="cl">
</span></span><span class="line"><span class="cl"><span class="c1"># 将 openmp 库和头文件复制到 xcode 工具链 sysroot</span>
</span></span><span class="line"><span class="cl">
</span></span><span class="line"><span class="cl">sudo cp install/include/* /Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/usr/include
</span></span><span class="line"><span class="cl">sudo cp install/lib/libomp.a /Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/usr/lib
</span></span></code></pre></div><p>下载并安装 openmp 以启用 iPhoneSimulator 上的多线程推理功能</p>
<div class="highlight"><pre tabindex="0" class="chroma"><code class="language-bash" data-lang="bash"><span class="line"><span class="cl">wget https://github.com/llvm/llvm-project/releases/download/llvmorg-11.0.0/openmp-11.0.0.src.tar.xz
</span></span><span class="line"><span class="cl">tar -xf openmp-11.0.0.src.tar.xz
</span></span><span class="line"><span class="cl"><span class="nb">cd</span> openmp-11.0.0.src
</span></span><span class="line"><span class="cl">
</span></span><span class="line"><span class="cl"><span class="c1"># 应用一些编译修复</span>
</span></span><span class="line"><span class="cl">
</span></span><span class="line"><span class="cl">sed -i<span class="s1">&#39;&#39;</span> -e <span class="s1">&#39;/.size __kmp_unnamed_critical_addr/d&#39;</span> runtime/src/z_Linux_asm.S
</span></span><span class="line"><span class="cl">sed -i<span class="s1">&#39;&#39;</span> -e <span class="s1">&#39;s/__kmp_unnamed_critical_addr/___kmp_unnamed_critical_addr/g&#39;</span> runtime/src/z_Linux_asm.S
</span></span><span class="line"><span class="cl">
</span></span><span class="line"><span class="cl">mkdir -p build-ios-sim
</span></span><span class="line"><span class="cl"><span class="nb">cd</span> build-ios-sim
</span></span><span class="line"><span class="cl">
</span></span><span class="line"><span class="cl">cmake -DCMAKE_TOOLCHAIN_FILE<span class="o">=</span>../toolchains/ios.toolchain.cmake -DCMAKE_BUILD_TYPE<span class="o">=</span>Release -DCMAKE_INSTALL_PREFIX<span class="o">=</span>install <span class="se">\
</span></span></span><span class="line"><span class="cl"> -DIOS_PLATFORM<span class="o">=</span>SIMULATOR -DENABLE_BITCODE<span class="o">=</span><span class="m">0</span> -DENABLE_ARC<span class="o">=</span><span class="m">0</span> -DENABLE_VISIBILITY<span class="o">=</span><span class="m">0</span> -DIOS_ARCH<span class="o">=</span><span class="s2">&#34;i386;x86_64&#34;</span> <span class="se">\
</span></span></span><span class="line"><span class="cl"> -DPERL_EXECUTABLE<span class="o">=</span>/usr/local/bin/perl <span class="se">\
</span></span></span><span class="line"><span class="cl"> -DLIBOMP_ENABLE_SHARED<span class="o">=</span>OFF -DLIBOMP_OMPT_SUPPORT<span class="o">=</span>OFF -DLIBOMP_USE_HWLOC<span class="o">=</span>OFF ..
</span></span><span class="line"><span class="cl">
</span></span><span class="line"><span class="cl">cmake --build . -j <span class="m">4</span>
</span></span><span class="line"><span class="cl">cmake --build . --target install
</span></span><span class="line"><span class="cl">
</span></span><span class="line"><span class="cl"><span class="c1"># 将 openmp 库和头文件复制到 xcode 工具链 sysroot</span>
</span></span><span class="line"><span class="cl">
</span></span><span class="line"><span class="cl">sudo cp install/include/* /Applications/Xcode.app/Contents/Developer/Platforms/iPhoneSimulator.platform/Developer/SDKs/iPhoneSimulator.sdk/usr/include
</span></span><span class="line"><span class="cl">sudo cp install/lib/libomp.a /Applications/Xcode.app/Contents/Developer/Platforms/iPhoneSimulator.platform/Developer/SDKs/iPhoneSimulator.sdk/usr/lib
</span></span></code></pre></div><p>打包 openmp 框架：</p>
<div class="highlight"><pre tabindex="0" class="chroma"><code class="language-bash" data-lang="bash"><span class="line"><span class="cl"><span class="nb">cd</span> &lt;openmp-root-dir&gt;
</span></span><span class="line"><span class="cl">
</span></span><span class="line"><span class="cl">mkdir -p openmp.framework/Versions/A/Headers
</span></span><span class="line"><span class="cl">mkdir -p openmp.framework/Versions/A/Resources
</span></span><span class="line"><span class="cl">ln -s A openmp.framework/Versions/Current
</span></span><span class="line"><span class="cl">ln -s Versions/Current/Headers openmp.framework/Headers
</span></span><span class="line"><span class="cl">ln -s Versions/Current/Resources openmp.framework/Resources
</span></span><span class="line"><span class="cl">ln -s Versions/Current/openmp openmp.framework/openmp
</span></span><span class="line"><span class="cl">lipo -create build-ios/install/lib/libomp.a build-ios-sim/install/lib/libomp.a -o openmp.framework/Versions/A/openmp
</span></span><span class="line"><span class="cl">cp -r build-ios/install/include/* openmp.framework/Versions/A/Headers/
</span></span><span class="line"><span class="cl">sed -e <span class="s1">&#39;s/__NAME__/openmp/g&#39;</span> -e <span class="s1">&#39;s/__IDENTIFIER__/org.llvm.openmp/g&#39;</span> -e <span class="s1">&#39;s/__VERSION__/11.0/g&#39;</span> Info.plist &gt; openmp.framework/Versions/A/Resources/Info.plist
</span></span></code></pre></div><p>下载并安装 Vulkan SDK。</p>
<div class="highlight"><pre tabindex="0" class="chroma"><code class="language-bash" data-lang="bash"><span class="line"><span class="cl">wget https://sdk.lunarg.com/sdk/download/1.2.189.0/mac/vulkansdk-macos-1.2.189.0.dmg?Human<span class="o">=</span><span class="nb">true</span> -O vulkansdk-macos-1.2.189.0.dmg
</span></span><span class="line"><span class="cl">hdiutil attach vulkansdk-macos-1.2.189.0.dmg
</span></span><span class="line"><span class="cl">sudo /Volumes/vulkansdk-macos-1.2.189.0/InstallVulkan.app/Contents/MacOS/InstallVulkan --root <span class="sb">`</span><span class="nb">pwd</span><span class="sb">`</span>/vulkansdk-macos-1.2.189.0 --accept-licenses --default-answer --confirm-command install
</span></span><span class="line"><span class="cl">hdiutil detach /Volumes/vulkansdk-macos-1.2.189.0
</span></span><span class="line"><span class="cl">
</span></span><span class="line"><span class="cl"><span class="c1"># 设置环境变量</span>
</span></span><span class="line"><span class="cl">
</span></span><span class="line"><span class="cl"><span class="nb">export</span> <span class="nv">VULKAN_SDK</span><span class="o">=</span><span class="sb">`</span><span class="nb">pwd</span><span class="sb">`</span>/vulkansdk-macos-1.2.189.0/macOS
</span></span></code></pre></div><p>为 iPhoneOS 构建库：</p>
<div class="highlight"><pre tabindex="0" class="chroma"><code class="language-bash" data-lang="bash"><span class="line"><span class="cl"><span class="nb">cd</span> &lt;ncnn-root-dir&gt;
</span></span><span class="line"><span class="cl">mkdir -p build-ios
</span></span><span class="line"><span class="cl"><span class="nb">cd</span> build-ios
</span></span><span class="line"><span class="cl">
</span></span><span class="line"><span class="cl">cmake -DCMAKE_TOOLCHAIN_FILE<span class="o">=</span>../toolchains/ios.toolchain.cmake -DIOS_PLATFORM<span class="o">=</span>OS -DIOS_ARCH<span class="o">=</span><span class="s2">&#34;armv7;arm64;arm64e&#34;</span> <span class="se">\
</span></span></span><span class="line"><span class="cl"> -DENABLE_BITCODE<span class="o">=</span><span class="m">0</span> -DENABLE_ARC<span class="o">=</span><span class="m">0</span> -DENABLE_VISIBILITY<span class="o">=</span><span class="m">0</span> <span class="se">\
</span></span></span><span class="line"><span class="cl"> -DOpenMP_C_FLAGS<span class="o">=</span><span class="s2">&#34;-Xclang -fopenmp&#34;</span> -DOpenMP_CXX_FLAGS<span class="o">=</span><span class="s2">&#34;-Xclang -fopenmp&#34;</span> <span class="se">\
</span></span></span><span class="line"><span class="cl"> -DOpenMP_C_LIB_NAMES<span class="o">=</span><span class="s2">&#34;libomp&#34;</span> -DOpenMP_CXX_LIB_NAMES<span class="o">=</span><span class="s2">&#34;libomp&#34;</span> <span class="se">\
</span></span></span><span class="line"><span class="cl"> -DOpenMP_libomp_LIBRARY<span class="o">=</span><span class="s2">&#34;/Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/usr/lib/libomp.a&#34;</span> <span class="se">\
</span></span></span><span class="line"><span class="cl"> -DNCNN_BUILD_BENCHMARK<span class="o">=</span>OFF ..
</span></span><span class="line"><span class="cl">
</span></span><span class="line"><span class="cl"><span class="c1"># Vulkan 仅在 arm64 设备上可用</span>
</span></span><span class="line"><span class="cl">
</span></span><span class="line"><span class="cl">cmake -DCMAKE_TOOLCHAIN_FILE<span class="o">=</span>../toolchains/ios.toolchain.cmake -DIOS_PLATFORM<span class="o">=</span>OS64 -DIOS_ARCH<span class="o">=</span><span class="s2">&#34;arm64;arm64e&#34;</span> <span class="se">\
</span></span></span><span class="line"><span class="cl"> -DENABLE_BITCODE<span class="o">=</span><span class="m">0</span> -DENABLE_ARC<span class="o">=</span><span class="m">0</span> -DENABLE_VISIBILITY<span class="o">=</span><span class="m">0</span> <span class="se">\
</span></span></span><span class="line"><span class="cl"> -DOpenMP_C_FLAGS<span class="o">=</span><span class="s2">&#34;-Xclang -fopenmp&#34;</span> -DOpenMP_CXX_FLAGS<span class="o">=</span><span class="s2">&#34;-Xclang -fopenmp&#34;</span> <span class="se">\
</span></span></span><span class="line"><span class="cl"> -DOpenMP_C_LIB_NAMES<span class="o">=</span><span class="s2">&#34;libomp&#34;</span> -DOpenMP_CXX_LIB_NAMES<span class="o">=</span><span class="s2">&#34;libomp&#34;</span> <span class="se">\
</span></span></span><span class="line"><span class="cl"> -DOpenMP_libomp_LIBRARY<span class="o">=</span><span class="s2">&#34;/Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/usr/lib/libomp.a&#34;</span> <span class="se">\
</span></span></span><span class="line"><span class="cl"> -DVulkan_INCLUDE_DIR<span class="o">=</span><span class="sb">`</span><span class="nb">pwd</span><span class="sb">`</span>/../vulkansdk-macos-1.2.189.0/MoltenVK/include <span class="se">\
</span></span></span><span class="line"><span class="cl"> -DVulkan_LIBRARY<span class="o">=</span><span class="sb">`</span><span class="nb">pwd</span><span class="sb">`</span>/../vulkansdk-macos-1.2.189.0/MoltenVK/dylib/iOS/libMoltenVK.dylib <span class="se">\
</span></span></span><span class="line"><span class="cl"> -DNCNN_VULKAN<span class="o">=</span>ON -DNCNN_BUILD_BENCHMARK<span class="o">=</span>OFF ..
</span></span><span class="line"><span class="cl">
</span></span><span class="line"><span class="cl">cmake --build . -j <span class="m">4</span>
</span></span><span class="line"><span class="cl">cmake --build . --target install
</span></span></code></pre></div><p>为 iPhoneSimulator 构建库：</p>
<div class="highlight"><pre tabindex="0" class="chroma"><code class="language-bash" data-lang="bash"><span class="line"><span class="cl"><span class="nb">cd</span> &lt;ncnn-root-dir&gt;
</span></span><span class="line"><span class="cl">mkdir -p build-ios-sim
</span></span><span class="line"><span class="cl"><span class="nb">cd</span> build-ios-sim
</span></span><span class="line"><span class="cl">
</span></span><span class="line"><span class="cl">cmake -DCMAKE_TOOLCHAIN_FILE<span class="o">=</span>../toolchains/ios.toolchain.cmake -DIOS_PLATFORM<span class="o">=</span>SIMULATOR -DIOS_ARCH<span class="o">=</span><span class="s2">&#34;i386;x86_64&#34;</span> <span class="se">\
</span></span></span><span class="line"><span class="cl"> -DENABLE_BITCODE<span class="o">=</span><span class="m">0</span> -DENABLE_ARC<span class="o">=</span><span class="m">0</span> -DENABLE_VISIBILITY<span class="o">=</span><span class="m">0</span> <span class="se">\
</span></span></span><span class="line"><span class="cl"> -DOpenMP_C_FLAGS<span class="o">=</span><span class="s2">&#34;-Xclang -fopenmp&#34;</span> -DOpenMP_CXX_FLAGS<span class="o">=</span><span class="s2">&#34;-Xclang -fopenmp&#34;</span> <span class="se">\
</span></span></span><span class="line"><span class="cl"> -DOpenMP_C_LIB_NAMES<span class="o">=</span><span class="s2">&#34;libomp&#34;</span> -DOpenMP_CXX_LIB_NAMES<span class="o">=</span><span class="s2">&#34;libomp&#34;</span> <span class="se">\
</span></span></span><span class="line"><span class="cl"> -DOpenMP_libomp_LIBRARY<span class="o">=</span><span class="s2">&#34;/Applications/Xcode.app/Contents/Developer/Platforms/iPhoneSimulator.platform/Developer/SDKs/iPhoneSimulator.sdk/usr/lib/libomp.a&#34;</span> <span class="se">\
</span></span></span><span class="line"><span class="cl"> -DNCNN_BUILD_BENCHMARK<span class="o">=</span>OFF ..
</span></span><span class="line"><span class="cl">
</span></span><span class="line"><span class="cl">cmake --build . -j <span class="m">4</span>
</span></span><span class="line"><span class="cl">cmake --build . --target install
</span></span></code></pre></div><p>打包 glslang 框架：</p>
<div class="highlight"><pre tabindex="0" class="chroma"><code class="language-bash" data-lang="bash"><span class="line"><span class="cl"><span class="nb">cd</span> &lt;ncnn-root-dir&gt;
</span></span><span class="line"><span class="cl">
</span></span><span class="line"><span class="cl">mkdir -p glslang.framework/Versions/A/Headers
</span></span><span class="line"><span class="cl">mkdir -p glslang.framework/Versions/A/Resources
</span></span><span class="line"><span class="cl">ln -s A glslang.framework/Versions/Current
</span></span><span class="line"><span class="cl">ln -s Versions/Current/Headers glslang.framework/Headers
</span></span><span class="line"><span class="cl">ln -s Versions/Current/Resources glslang.framework/Resources
</span></span><span class="line"><span class="cl">ln -s Versions/Current/glslang glslang.framework/glslang
</span></span><span class="line"><span class="cl">libtool -static build-ios/install/lib/libglslang.a build-ios/install/lib/libSPIRV.a build-ios/install/lib/libOGLCompiler.a build-ios/install/lib/libOSDependent.a -o build-ios/install/lib/libglslang_combined.a
</span></span><span class="line"><span class="cl">libtool -static build-ios-sim/install/lib/libglslang.a build-ios-sim/install/lib/libSPIRV.a build-ios-sim/install/lib/libOGLCompiler.a build-ios-sim/install/lib/libOSDependent.a -o build-ios-sim/install/lib/libglslang_combined.a
</span></span><span class="line"><span class="cl">lipo -create build-ios/install/lib/libglslang_combined.a build-ios-sim/install/lib/libglslang_combined.a -o glslang.framework/Versions/A/glslang
</span></span><span class="line"><span class="cl">cp -r build-ios/install/include/glslang glslang.framework/Versions/A/Headers/
</span></span><span class="line"><span class="cl">sed -e <span class="s1">&#39;s/__NAME__/glslang/g&#39;</span> -e <span class="s1">&#39;s/__IDENTIFIER__/org.khronos.glslang/g&#39;</span> -e <span class="s1">&#39;s/__VERSION__/1.0/g&#39;</span> Info.plist &gt; glslang.framework/Versions/A/Resources/Info.plist
</span></span></code></pre></div><p>打包 ncnn 框架：</p>
<div class="highlight"><pre tabindex="0" class="chroma"><code class="language-bash" data-lang="bash"><span class="line"><span class="cl"><span class="nb">cd</span> &lt;ncnn-root-dir&gt;
</span></span><span class="line"><span class="cl">
</span></span><span class="line"><span class="cl">mkdir -p ncnn.framework/Versions/A/Headers
</span></span><span class="line"><span class="cl">mkdir -p ncnn.framework/Versions/A/Resources
</span></span><span class="line"><span class="cl">ln -s A ncnn.framework/Versions/Current
</span></span><span class="line"><span class="cl">ln -s Versions/Current/Headers ncnn.framework/Headers
</span></span><span class="line"><span class="cl">ln -s Versions/Current/Resources ncnn.framework/Resources
</span></span><span class="line"><span class="cl">ln -s Versions/Current/ncnn ncnn.framework/ncnn
</span></span><span class="line"><span class="cl">lipo -create build-ios/install/lib/libncnn.a build-ios-sim/install/lib/libncnn.a -o ncnn.framework/Versions/A/ncnn
</span></span><span class="line"><span class="cl">cp -r build-ios/install/include/* ncnn.framework/Versions/A/Headers/
</span></span><span class="line"><span class="cl">sed -e <span class="s1">&#39;s/__NAME__/ncnn/g&#39;</span> -e <span class="s1">&#39;s/__IDENTIFIER__/com.tencent.ncnn/g&#39;</span> -e <span class="s1">&#39;s/__VERSION__/1.0/g&#39;</span> Info.plist &gt; ncnn.framework/Versions/A/Resources/Info.plist
</span></span></code></pre></div><p>选择 <code>ncnn.framework</code>、<code>glslang.framework</code> 和 <code>openmp.framework</code> 文件夹进行应用开发。</p>
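<p>上面各个打包步骤都用 sed 把 Info.plist 模板里的 <code>__NAME__</code>、<code>__IDENTIFIER__</code>、<code>__VERSION__</code> 占位符替换为具体框架信息（ncnn 源码根目录中的 Info.plist 即此类模板）。下面用一个假设的最小模板片段演示这一替换方式，文件名 <code>Info.plist.tmpl</code>、<code>Info.plist.out</code> 仅为示意：</p>

```shell
# 仅作演示的最小模板片段（假设内容，并非 ncnn 官方模板全文）
cat > Info.plist.tmpl <<'EOF'
<key>CFBundleName</key><string>__NAME__</string>
<key>CFBundleIdentifier</key><string>__IDENTIFIER__</string>
<key>CFBundleShortVersionString</key><string>__VERSION__</string>
EOF

# 与正文相同的替换方式：把占位符换成具体框架信息
sed -e 's/__NAME__/ncnn/g' -e 's/__IDENTIFIER__/com.tencent.ncnn/g' -e 's/__VERSION__/1.0/g' \
    Info.plist.tmpl > Info.plist.out
cat Info.plist.out
```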
<hr>
<h3 id="为-webassembly-构建">为 WebAssembly 构建</h3>
<p>安装 Emscripten</p>
<div class="highlight"><pre tabindex="0" class="chroma"><code class="language-bash" data-lang="bash"><span class="line"><span class="cl">git clone https://github.com/emscripten-core/emsdk.git
</span></span><span class="line"><span class="cl"><span class="nb">cd</span> emsdk
</span></span><span class="line"><span class="cl">./emsdk install 2.0.8
</span></span><span class="line"><span class="cl">./emsdk activate 2.0.8
</span></span><span class="line"><span class="cl">
</span></span><span class="line"><span class="cl"><span class="nb">cd</span> ..
</span></span><span class="line"><span class="cl"><span class="nb">source</span> emsdk/emsdk_env.sh
</span></span></code></pre></div><p>不带任何扩展名构建以实现通用兼容性：</p>
<div class="highlight"><pre tabindex="0" class="chroma"><code class="language-bash" data-lang="bash"><span class="line"><span class="cl">mkdir -p build
</span></span><span class="line"><span class="cl"><span class="nb">cd</span> build
</span></span><span class="line"><span class="cl">cmake -DCMAKE_TOOLCHAIN_FILE<span class="o">=</span>../emsdk/upstream/emscripten/cmake/Modules/Platform/Emscripten.cmake <span class="se">\
</span></span></span><span class="line"><span class="cl"> -DNCNN_THREADS<span class="o">=</span>OFF -DNCNN_OPENMP<span class="o">=</span>OFF -DNCNN_SIMPLEOMP<span class="o">=</span>OFF -DNCNN_RUNTIME_CPU<span class="o">=</span>OFF -DNCNN_SSE2<span class="o">=</span>OFF -DNCNN_AVX2<span class="o">=</span>OFF -DNCNN_AVX<span class="o">=</span>OFF <span class="se">\
</span></span></span><span class="line"><span class="cl"> -DNCNN_BUILD_TOOLS<span class="o">=</span>OFF -DNCNN_BUILD_EXAMPLES<span class="o">=</span>OFF -DNCNN_BUILD_BENCHMARK<span class="o">=</span>OFF ..
</span></span><span class="line"><span class="cl">cmake --build . -j <span class="m">4</span>
</span></span><span class="line"><span class="cl">cmake --build . --target install
</span></span></code></pre></div><p>带 WASM SIMD 扩展构建：</p>
<div class="highlight"><pre tabindex="0" class="chroma"><code class="language-bash" data-lang="bash"><span class="line"><span class="cl">mkdir -p build-simd
</span></span><span class="line"><span class="cl"><span class="nb">cd</span> build-simd
</span></span><span class="line"><span class="cl">cmake -DCMAKE_TOOLCHAIN_FILE<span class="o">=</span>../emsdk/upstream/emscripten/cmake/Modules/Platform/Emscripten.cmake <span class="se">\
</span></span></span><span class="line"><span class="cl"> -DNCNN_THREADS<span class="o">=</span>OFF -DNCNN_OPENMP<span class="o">=</span>OFF -DNCNN_SIMPLEOMP<span class="o">=</span>OFF -DNCNN_RUNTIME_CPU<span class="o">=</span>OFF -DNCNN_SSE2<span class="o">=</span>ON -DNCNN_AVX2<span class="o">=</span>OFF -DNCNN_AVX<span class="o">=</span>OFF <span class="se">\
</span></span></span><span class="line"><span class="cl"> -DNCNN_BUILD_TOOLS<span class="o">=</span>OFF -DNCNN_BUILD_EXAMPLES<span class="o">=</span>OFF -DNCNN_BUILD_BENCHMARK<span class="o">=</span>OFF ..
</span></span><span class="line"><span class="cl">cmake --build . -j <span class="m">4</span>
</span></span><span class="line"><span class="cl">cmake --build . --target install
</span></span></code></pre></div><p>带 WASM Thread 扩展构建：</p>
<div class="highlight"><pre tabindex="0" class="chroma"><code class="language-bash" data-lang="bash"><span class="line"><span class="cl">mkdir -p build-threads
</span></span><span class="line"><span class="cl"><span class="nb">cd</span> build-threads
</span></span><span class="line"><span class="cl">cmake -DCMAKE_TOOLCHAIN_FILE<span class="o">=</span>../emsdk/upstream/emscripten/cmake/Modules/Platform/Emscripten.cmake <span class="se">\
</span></span></span><span class="line"><span class="cl"> -DNCNN_THREADS<span class="o">=</span>ON -DNCNN_OPENMP<span class="o">=</span>ON -DNCNN_SIMPLEOMP<span class="o">=</span>ON -DNCNN_RUNTIME_CPU<span class="o">=</span>OFF -DNCNN_SSE2<span class="o">=</span>OFF -DNCNN_AVX2<span class="o">=</span>OFF -DNCNN_AVX<span class="o">=</span>OFF <span class="se">\
</span></span></span><span class="line"><span class="cl"> -DNCNN_BUILD_TOOLS<span class="o">=</span>OFF -DNCNN_BUILD_EXAMPLES<span class="o">=</span>OFF -DNCNN_BUILD_BENCHMARK<span class="o">=</span>OFF ..
</span></span><span class="line"><span class="cl">cmake --build . -j <span class="m">4</span>
</span></span><span class="line"><span class="cl">cmake --build . --target install
</span></span></code></pre></div><p>带 WASM SIMD 和 Thread 扩展构建：</p>
<div class="highlight"><pre tabindex="0" class="chroma"><code class="language-bash" data-lang="bash"><span class="line"><span class="cl">mkdir -p build-simd-threads
</span></span><span class="line"><span class="cl"><span class="nb">cd</span> build-simd-threads
</span></span><span class="line"><span class="cl">cmake -DCMAKE_TOOLCHAIN_FILE<span class="o">=</span>../emsdk/upstream/emscripten/cmake/Modules/Platform/Emscripten.cmake <span class="se">\
</span></span></span><span class="line"><span class="cl"> -DNCNN_THREADS<span class="o">=</span>ON -DNCNN_OPENMP<span class="o">=</span>ON -DNCNN_SIMPLEOMP<span class="o">=</span>ON -DNCNN_RUNTIME_CPU<span class="o">=</span>OFF -DNCNN_SSE2<span class="o">=</span>ON -DNCNN_AVX2<span class="o">=</span>OFF -DNCNN_AVX<span class="o">=</span>OFF <span class="se">\
</span></span></span><span class="line"><span class="cl"> -DNCNN_BUILD_TOOLS<span class="o">=</span>OFF -DNCNN_BUILD_EXAMPLES<span class="o">=</span>OFF -DNCNN_BUILD_BENCHMARK<span class="o">=</span>OFF ..
</span></span><span class="line"><span class="cl">cmake --build . -j <span class="m">4</span>
</span></span><span class="line"><span class="cl">cmake --build . --target install
</span></span></code></pre></div><p>选择 <code>build-XYZ/install</code> 文件夹用于进一步使用。</p>
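<p>四种构建产物的目录名由启用的扩展决定：无扩展为 <code>build</code>，启用 SIMD 为 <code>build-simd</code>，启用线程为 <code>build-threads</code>，两者都启用为 <code>build-simd-threads</code>。下面这个小脚本只是示意性的辅助写法（函数名 <code>pick_build_dir</code> 为假设），把开关映射到上文的目录名：</p>

```shell
# 根据启用的 wasm 扩展选出对应的构建目录（示意脚本）
pick_build_dir() {
    simd="$1"     # 1 表示启用 WASM SIMD
    threads="$2"  # 1 表示启用 WASM Thread
    dir="build"
    [ "$simd" = 1 ] && dir="build-simd"
    [ "$threads" = 1 ] && dir="build-threads"
    [ "$simd" = 1 ] && [ "$threads" = 1 ] && dir="build-simd-threads"
    echo "$dir/install"
}

pick_build_dir 0 0   # 输出 build/install
pick_build_dir 1 0   # 输出 build-simd/install
pick_build_dir 1 1   # 输出 build-simd-threads/install
```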
<hr>
<h3 id="为-allwinner-d1-构建">为 AllWinner D1 构建</h3>
<p>从 <a href="https://occ.t-head.cn/community/download?id=3913221581316624384">https://occ.t-head.cn/community/download?id=3913221581316624384</a> 下载 c906 工具链包。</p>
<div class="highlight"><pre tabindex="0" class="chroma"><code class="language-bash" data-lang="bash"><span class="line"><span class="cl">tar -xf riscv64-linux-x86_64-20210512.tar.gz
</span></span><span class="line"><span class="cl"><span class="nb">export</span> <span class="nv">RISCV_ROOT_PATH</span><span class="o">=</span>/home/nihui/osd/riscv64-linux-x86_64-20210512
</span></span></code></pre></div><p>启用 riscv-v 向量和 simpleocv 的 ncnn 构建：</p>
<div class="highlight"><pre tabindex="0" class="chroma"><code class="language-bash" data-lang="bash"><span class="line"><span class="cl">mkdir -p build-c906
</span></span><span class="line"><span class="cl"><span class="nb">cd</span> build-c906
</span></span><span class="line"><span class="cl">cmake -DCMAKE_TOOLCHAIN_FILE<span class="o">=</span>../toolchains/c906.toolchain.cmake <span class="se">\
</span></span></span><span class="line"><span class="cl"> -DCMAKE_BUILD_TYPE<span class="o">=</span>relwithdebinfo -DNCNN_OPENMP<span class="o">=</span>OFF -DNCNN_THREADS<span class="o">=</span>OFF -DNCNN_RUNTIME_CPU<span class="o">=</span>OFF -DNCNN_RVV<span class="o">=</span>ON <span class="se">\
</span></span></span><span class="line"><span class="cl"> -DNCNN_SIMPLEOCV<span class="o">=</span>ON -DNCNN_BUILD_EXAMPLES<span class="o">=</span>ON ..
</span></span><span class="line"><span class="cl">cmake --build . -j <span class="m">4</span>
</span></span><span class="line"><span class="cl">cmake --build . --target install
</span></span></code></pre></div><p>选择 <code>build-c906/install</code> 文件夹用于进一步使用。</p>
<p>你可以将二进制文件上传到 <code>build-c906/examples</code> 文件夹并在 D1 板上运行进行测试。</p>
<hr>
<h3 id="为-loongson-2k1000-构建">为 Loongson 2K1000 构建</h3>
<p>对于 gcc 版本 &lt;= 8.5，你需要修复 msa.h 头文件以解决 msa fmadd 错误。</p>
<p>打开 <code>/usr/lib/gcc/mips64el-linux-gnuabi64/8/include/msa.h</code>，找到 <code>__msa_fmadd_w</code> 并应用以下更改：</p>
<pre tabindex="0"><code>// #define __msa_fmadd_w __builtin_msa_fmadd_w
#define __msa_fmadd_w(a, b, c) __builtin_msa_fmadd_w(c, b, a)
</code></pre><p>启用 mips msa 和 simpleocv 的 ncnn 构建：</p>
<div class="highlight"><pre tabindex="0" class="chroma"><code class="language-bash" data-lang="bash"><span class="line"><span class="cl">mkdir -p build
</span></span><span class="line"><span class="cl"><span class="nb">cd</span> build
</span></span><span class="line"><span class="cl">cmake -DNCNN_DISABLE_RTTI<span class="o">=</span>ON -DNCNN_DISABLE_EXCEPTION<span class="o">=</span>ON -DNCNN_RUNTIME_CPU<span class="o">=</span>OFF -DNCNN_MSA<span class="o">=</span>ON -DNCNN_MMI<span class="o">=</span>ON -DNCNN_SIMPLEOCV<span class="o">=</span>ON ..
</span></span><span class="line"><span class="cl">cmake --build . -j <span class="m">2</span>
</span></span><span class="line"><span class="cl">cmake --build . --target install
</span></span></code></pre></div><p>选择 <code>build/install</code> 文件夹用于进一步使用。</p>
<p>你可以运行 <code>build/examples</code> 文件夹中的二进制文件进行测试。</p>
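<p>上面对 msa.h 的手工修改也可以用 sed 自动完成。下面在一个临时摘录文件上演示这一替换（实际文件为 <code>/usr/lib/gcc/mips64el-linux-gnuabi64/8/include/msa.h</code>，修改前请先备份；脚本仅为示意）：</p>

```shell
# 在摘录文件上演示 msa.h 的 fmadd 修复（示意，实际请在备份后对真实头文件操作）
cat > msa_excerpt.h <<'EOF'
#define __msa_fmadd_w __builtin_msa_fmadd_w
EOF

# 把宏改写为交换参数顺序的形式，规避旧版 gcc 的 fmadd 错误
sed -i -e 's/#define __msa_fmadd_w __builtin_msa_fmadd_w/#define __msa_fmadd_w(a, b, c) __builtin_msa_fmadd_w(c, b, a)/' msa_excerpt.h
cat msa_excerpt.h
```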
<hr>
<h3 id="为-android-上的-termux-构建">为 Android 上的 Termux 构建</h3>
<p>在手机上安装 Termux 应用，并在 Termux 中安装 Ubuntu。</p>
<p>如果你想使用 ssh，只需在 Termux 中安装 openssh。</p>
<div class="highlight"><pre tabindex="0" class="chroma"><code class="language-bash" data-lang="bash"><span class="line"><span class="cl">pkg install proot-distro
</span></span><span class="line"><span class="cl">proot-distro install ubuntu
</span></span></code></pre></div><p>或者你可以使用 <code>proot-distro list</code> 查看可以安装的系统。</p>
<p>在成功安装 Ubuntu 后，使用 <code>proot-distro login ubuntu</code> 登录 Ubuntu。</p>
<p>然后构建 ncnn，无需安装任何其他依赖项。</p>
<div class="highlight"><pre tabindex="0" class="chroma"><code class="language-bash" data-lang="bash"><span class="line"><span class="cl">git clone https://github.com/Tencent/ncnn.git
</span></span><span class="line"><span class="cl"><span class="nb">cd</span> ncnn
</span></span><span class="line"><span class="cl">git submodule update --init
</span></span><span class="line"><span class="cl">mkdir -p build
</span></span><span class="line"><span class="cl"><span class="nb">cd</span> build
</span></span><span class="line"><span class="cl">cmake -DCMAKE_BUILD_TYPE<span class="o">=</span>Release -DNCNN_BUILD_EXAMPLES<span class="o">=</span>ON -DNCNN_PLATFORM_API<span class="o">=</span>OFF -DNCNN_SIMPLEOCV<span class="o">=</span>ON ..
</span></span><span class="line"><span class="cl">make -j<span class="k">$(</span>nproc<span class="k">)</span>
</span></span></code></pre></div><p>然后你可以运行测试：</p>
<blockquote>
<p>在我的 Pixel 3 XL 上使用 Qualcomm 845，无法加载 <code>256-ncnn.png</code></p>
</blockquote>
<div class="highlight"><pre tabindex="0" class="chroma"><code class="language-bash" data-lang="bash"><span class="line"><span class="cl"><span class="nb">cd</span> ../examples
</span></span><span class="line"><span class="cl">../build/examples/squeezenet ../images/128-ncnn.png
</span></span></code></pre></div>]]></content:encoded>
    </item>
    <item>
      <title>在 Android 11 上用 LLDB 调试原生应用</title>
      <link>https://xgdebug.com/zh/posts/tech/debug/debugging-native-apps-with-lldb-on-android-11/</link>
      <pubDate>Fri, 11 Aug 2023 01:14:06 +0000</pubDate>
      <guid>https://xgdebug.com/zh/posts/tech/debug/debugging-native-apps-with-lldb-on-android-11/</guid>
      <description>&lt;h2 id=&#34;mobile&#34;&gt;Mobile&lt;/h2&gt;
&lt;h2 id=&#34;将-push-调试服务器推送到手机&#34;&gt;将 PUSH 调试服务器推送到手机&lt;/h2&gt;
&lt;pre tabindex=&#34;0&#34;&gt;&lt;code&gt;adb push lldb-server /data/local/tmp
chmod 755 /data/local/tmp/lldb-server
&lt;/code&gt;&lt;/pre&gt;&lt;h2 id=&#34;启动调试器服务&#34;&gt;启动调试器服务&lt;/h2&gt;
&lt;pre tabindex=&#34;0&#34;&gt;&lt;code&gt;/data/local/tmp/lldb-server platform --listen &amp;#34;*:8888&amp;#34; --server
&lt;/code&gt;&lt;/pre&gt;
&lt;h2 id=&#34;电脑端&#34;&gt;电脑端&lt;/h2&gt;
&lt;h2 id=&#34;端口转发&#34;&gt;端口转发&lt;/h2&gt;
&lt;pre tabindex=&#34;0&#34;&gt;&lt;code&gt;adb forward tcp:8888 tcp:8888
&lt;/code&gt;&lt;/pre&gt;&lt;h2 id=&#34;启动-lldb&#34;&gt;启动 LLDB&lt;/h2&gt;
&lt;pre tabindex=&#34;0&#34;&gt;&lt;code&gt;.\lldb.exe
&lt;/code&gt;&lt;/pre&gt;&lt;h2 id=&#34;查看支持的平台&#34;&gt;查看支持的平台&lt;/h2&gt;
&lt;pre tabindex=&#34;0&#34;&gt;&lt;code&gt;platform list
&lt;/code&gt;&lt;/pre&gt;&lt;h2 id=&#34;选择-android-平台&#34;&gt;选择 ANDROID 平台&lt;/h2&gt;
&lt;pre tabindex=&#34;0&#34;&gt;&lt;code&gt;platform select remote-android
&lt;/code&gt;&lt;/pre&gt;&lt;h2 id=&#34;连接到手机&#34;&gt;连接到手机&lt;/h2&gt;
&lt;p&gt;手机序列号: &lt;strong&gt;9643e0ec0604&lt;/strong&gt;。请将其改为当前调试手机的序列号，可用 &lt;code&gt;adb devices&lt;/code&gt; 查看&lt;/p&gt;</description>
      <content:encoded><![CDATA[<h2 id="mobile">Mobile</h2>
<h2 id="将-push-调试服务器推送到手机">将 PUSH 调试服务器推送到手机</h2>
<pre tabindex="0"><code>adb push lldb-server /data/local/tmp
chmod 755 /data/local/tmp/lldb-server
</code></pre><h2 id="启动调试器服务">启动调试器服务</h2>
<pre tabindex="0"><code>/data/local/tmp/lldb-server platform --listen &#34;*:8888&#34; --server
</code></pre>
<h2 id="电脑端">电脑端</h2>
<h2 id="端口转发">端口转发</h2>
<pre tabindex="0"><code>adb forward tcp:8888 tcp:8888
</code></pre><h2 id="启动-lldb">启动 LLDB</h2>
<pre tabindex="0"><code>.\lldb.exe
</code></pre><h2 id="查看支持的平台">查看支持的平台</h2>
<pre tabindex="0"><code>platform list
</code></pre><h2 id="选择-android-平台">选择 ANDROID 平台</h2>
<pre tabindex="0"><code>platform select remote-android
</code></pre><h2 id="连接到手机">连接到手机</h2>
<p>手机序列号: <strong>9643e0ec0604</strong>。请将其改为当前调试手机的序列号，可用 <code>adb devices</code> 查看</p>
<pre tabindex="0"><code>platform connect connect://9643e0ec0604:8888
</code></pre><h2 id="查看当前运行的进程">查看当前运行的进程</h2>
<pre tabindex="0"><code>platform process list
</code></pre><h2 id="附加到">附加到进程</h2>
<pre tabindex="0"><code>attach 9053
</code></pre><h2 id="断点">断点</h2>
<pre tabindex="0"><code>b send
</code></pre><h2 id="运行-继续执行">运行 (继续执行)</h2>
<pre tabindex="0"><code>c
</code></pre><h2 id="查看线程列表">查看线程列表</h2>
<pre tabindex="0"><code>thread list
</code></pre><h2 id="查看调用堆栈">查看调用堆栈</h2>
<pre tabindex="0"><code>bt
</code></pre>]]></content:encoded>
    </item>
    <item>
      <title>一个现代化APP的开发通常会涉及以下几个步骤和工具</title>
      <link>https://xgdebug.com/zh/posts/tech/app/modern-app-development-steps-tools/</link>
      <pubDate>Sun, 16 Jul 2023 01:14:02 +0000</pubDate>
      <guid>https://xgdebug.com/zh/posts/tech/app/modern-app-development-steps-tools/</guid>
      <description>&lt;h2 id=&#34;需求分析&#34;&gt;需求分析&lt;/h2&gt;
&lt;p&gt;分析目标用户的需求,确定 APP 要实现的核心功能和特色。这需要进行用户研究、竞品分析等。&lt;br&gt;
信息架构和交互设计
根据需求和用户研究,设计 APP 的信息架构,确定界面流程和交互逻辑。常用的设计工具有 Axure、Sketch 等。&lt;/p&gt;</description>
      <content:encoded><![CDATA[<h2 id="需求分析">需求分析</h2>
<p>分析目标用户的需求,确定 APP 要实现的核心功能和特色。这需要进行用户研究、竞品分析等。<br>
信息架构和交互设计
根据需求和用户研究,设计 APP 的信息架构,确定界面流程和交互逻辑。常用的设计工具有 Axure、Sketch 等。</p>
<h2 id="视觉设计">视觉设计</h2>
<p>进行 APP 的视觉设计,包括界面样式、图标、颜色、字体等。常用工具有 Photoshop、Illustrator 等。</p>
<h2 id="前端开发">前端开发</h2>
<p>使用前端开发框架开发 APP 界面,比如 React Native、Flutter 等。需要对 JavaScript、Dart 等语言熟练。</p>
<h2 id="后端开发">后端开发</h2>
<p>开发 APP 后端业务逻辑,提供接口和数据支持。使用 PHP、Java、Python 等服务器语言,以及 MySQL、MongoDB 等数据库。</p>
<h2 id="测试">测试</h2>
<p>在开发过程中进行功能测试、界面测试、性能测试、安全测试等,确保 APP 质量。使用工具如 Appium、JMeter 等。</p>
<h2 id="发布和运维">发布和运维</h2>
<p>将 APP 发布到 App Store 和 Google Play,并进行持续监控和后期优化升级。使用平台如 Firebase。</p>
<p>所以现代 APP 开发需要多学科协作,也需要掌握各种专业工具,才能开发出用户喜欢的产品。这个过程需要设计、开发和测试人员通力合作。</p>
]]></content:encoded>
    </item>
    <item>
      <title>一个现代的APP是如何诞生的？</title>
      <link>https://xgdebug.com/zh/posts/tech/app/how-a-modern-app-is-born/</link>
      <pubDate>Fri, 14 Jul 2023 05:07:13 +0000</pubDate>
      <guid>https://xgdebug.com/zh/posts/tech/app/how-a-modern-app-is-born/</guid>
      <description>&lt;p&gt;一个现代的 APP 是如何诞生的？&lt;/p&gt;
&lt;p&gt;从创意到产品上线运营所的所有步骤与工作岗位和他们需要用到的工具：&lt;/p&gt;
&lt;ol&gt;
&lt;li&gt;&lt;strong&gt;创意&lt;/strong&gt;&lt;/li&gt;
&lt;/ol&gt;
&lt;ul&gt;
&lt;li&gt;产品经理：负责制定产品的目标、功能和用户体验。&lt;/li&gt;
&lt;li&gt;设计师：负责设计产品的界面和交互。&lt;/li&gt;
&lt;li&gt;开发人员：负责开发产品的代码。&lt;/li&gt;
&lt;/ul&gt;
&lt;ol start=&#34;2&#34;&gt;
&lt;li&gt;&lt;strong&gt;开发&lt;/strong&gt;&lt;/li&gt;
&lt;/ol&gt;
&lt;ul&gt;
&lt;li&gt;开发人员：负责开发产品的代码。&lt;/li&gt;
&lt;li&gt;测试人员：负责测试产品的功能和性能。&lt;/li&gt;
&lt;li&gt;质量保证工程师：负责确保产品达到质量标准。&lt;/li&gt;
&lt;/ul&gt;
&lt;ol start=&#34;3&#34;&gt;
&lt;li&gt;&lt;strong&gt;上线&lt;/strong&gt;&lt;/li&gt;
&lt;/ol&gt;
&lt;ul&gt;
&lt;li&gt;产品经理：负责制定产品上线的策略。&lt;/li&gt;
&lt;li&gt;运营人员：负责产品上线后的运营工作，包括推广、营销、客服等。&lt;/li&gt;
&lt;/ul&gt;
&lt;ol start=&#34;4&#34;&gt;
&lt;li&gt;&lt;strong&gt;运营&lt;/strong&gt;&lt;/li&gt;
&lt;/ol&gt;
&lt;ul&gt;
&lt;li&gt;运营人员：负责产品上线后的运营工作，包括推广、营销、客服等。&lt;/li&gt;
&lt;li&gt;数据分析师：负责分析产品的用户数据，并根据数据做出改进。&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;以下是一些常用的工具：&lt;/p&gt;</description>
      <content:encoded><![CDATA[<p>一个现代的 APP 是如何诞生的？</p>
<p>从创意到产品上线运营所的所有步骤与工作岗位和他们需要用到的工具：</p>
<ol>
<li><strong>创意</strong></li>
</ol>
<ul>
<li>产品经理：负责制定产品的目标、功能和用户体验。</li>
<li>设计师：负责设计产品的界面和交互。</li>
<li>开发人员：负责开发产品的代码。</li>
</ul>
<ol start="2">
<li><strong>开发</strong></li>
</ol>
<ul>
<li>开发人员：负责开发产品的代码。</li>
<li>测试人员：负责测试产品的功能和性能。</li>
<li>质量保证工程师：负责确保产品达到质量标准。</li>
</ul>
<ol start="3">
<li><strong>上线</strong></li>
</ol>
<ul>
<li>产品经理：负责制定产品上线的策略。</li>
<li>运营人员：负责产品上线后的运营工作，包括推广、营销、客服等。</li>
</ul>
<ol start="4">
<li><strong>运营</strong></li>
</ol>
<ul>
<li>运营人员：负责产品上线后的运营工作，包括推广、营销、客服等。</li>
<li>数据分析师：负责分析产品的用户数据，并根据数据做出改进。</li>
</ul>
<p>以下是一些常用的工具：</p>
<ul>
<li>产品管理工具：Jira、Asana、Trello</li>
<li>设计工具：Sketch、Figma、Adobe XD</li>
<li>开发工具：Xcode、Android Studio、Visual Studio</li>
<li>测试工具：JUnit、Selenium、Xcode UI Testing</li>
<li>质量保证工具：SonarQube、Codeship、Travis CI</li>
<li>营销工具：Google Ads、Facebook Ads、Twitter Ads</li>
<li>客服工具：Zendesk、Intercom、HubSpot</li>
</ul>
<p><strong>注意</strong>：以上只是一个概述，具体的步骤和工作岗位可能会有所不同，具体情况要根据产品的不同而定。</p>
]]></content:encoded>
    </item>
  </channel>
</rss>
