<?xml version="1.0" encoding="utf-8" standalone="yes"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom">
    <channel>
        <title>Segmentation on Producthunt daily</title>
        <link>https://producthunt.programnotes.cn/en/tags/segmentation/</link>
        <description>Recent content in Segmentation on Producthunt daily</description>
        <generator>Hugo -- gohugo.io</generator>
        <language>en</language>
        <lastBuildDate>Sat, 20 Sep 2025 15:25:15 +0800</lastBuildDate><atom:link href="https://producthunt.programnotes.cn/en/tags/segmentation/index.xml" rel="self" type="application/rss+xml" /><item>
        <title>detectron2</title>
        <link>https://producthunt.programnotes.cn/en/p/detectron2/</link>
        <pubDate>Sat, 20 Sep 2025 15:25:15 +0800</pubDate>
        
        <guid>https://producthunt.programnotes.cn/en/p/detectron2/</guid>
        <description>&lt;img src="https://images.unsplash.com/photo-1677146138576-be7fd85bba89?ixid=M3w0NjAwMjJ8MHwxfHJhbmRvbXx8fHx8fHx8fDE3NTgzNTMwNTN8&amp;ixlib=rb-4.1.0" alt="Featured image of post detectron2" /&gt;&lt;h1 id=&#34;facebookresearchdetectron2&#34;&gt;&lt;a class=&#34;link&#34; href=&#34;https://github.com/facebookresearch/detectron2&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;facebookresearch/detectron2&lt;/a&gt;
&lt;/h1&gt;&lt;img src=&#34;.github/Detectron2-Logo-Horz.svg&#34; width=&#34;300&#34; &gt;
&lt;p&gt;Detectron2 is Facebook AI Research&amp;rsquo;s next generation library
that provides state-of-the-art detection and segmentation algorithms.
It is the successor of
&lt;a class=&#34;link&#34; href=&#34;https://github.com/facebookresearch/Detectron/&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;Detectron&lt;/a&gt;
and &lt;a class=&#34;link&#34; href=&#34;https://github.com/facebookresearch/maskrcnn-benchmark/&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;maskrcnn-benchmark&lt;/a&gt;.
It supports a number of computer vision research projects and production applications at Facebook.&lt;/p&gt;
&lt;div align=&#34;center&#34;&gt;
  &lt;img src=&#34;https://user-images.githubusercontent.com/1381301/66535560-d3422200-eace-11e9-9123-5535d469db19.png&#34;/&gt;
&lt;/div&gt;
&lt;br&gt;
&lt;h2 id=&#34;learn-more-about-detectron2&#34;&gt;Learn More about Detectron2
&lt;/h2&gt;&lt;ul&gt;
&lt;li&gt;Includes new capabilities such as panoptic segmentation, Densepose, Cascade R-CNN, rotated bounding boxes, PointRend,
DeepLab, ViTDet, MViTv2 etc.&lt;/li&gt;
&lt;li&gt;Can be used as a library to support building &lt;a class=&#34;link&#34; href=&#34;projects/&#34; &gt;research projects&lt;/a&gt; on top of it.&lt;/li&gt;
&lt;li&gt;Models can be exported to TorchScript format or Caffe2 format for deployment.&lt;/li&gt;
&lt;li&gt;It &lt;a class=&#34;link&#34; href=&#34;https://detectron2.readthedocs.io/notes/benchmarks.html&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;trains much faster&lt;/a&gt; than its predecessors.&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;See our &lt;a class=&#34;link&#34; href=&#34;https://ai.meta.com/blog/-detectron2-a-pytorch-based-modular-object-detection-library-/&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;blog post&lt;/a&gt;
for more demos.
See this &lt;a class=&#34;link&#34; href=&#34;https://ai.meta.com/blog/detectron-everingham-prize/&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;interview&lt;/a&gt; to learn more about the stories behind Detectron2.&lt;/p&gt;
&lt;h2 id=&#34;installation&#34;&gt;Installation
&lt;/h2&gt;&lt;p&gt;See &lt;a class=&#34;link&#34; href=&#34;https://detectron2.readthedocs.io/tutorials/install.html&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;installation instructions&lt;/a&gt;.&lt;/p&gt;
&lt;h2 id=&#34;getting-started&#34;&gt;Getting Started
&lt;/h2&gt;&lt;p&gt;See &lt;a class=&#34;link&#34; href=&#34;https://detectron2.readthedocs.io/tutorials/getting_started.html&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;Getting Started with Detectron2&lt;/a&gt;,
and the &lt;a class=&#34;link&#34; href=&#34;https://colab.research.google.com/drive/16jcaJoc6bCFAQ96jDe2HwtXj7BMD_-m5&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;Colab Notebook&lt;/a&gt;
to learn about basic usage.&lt;/p&gt;
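&lt;p&gt;As a minimal sketch of the basic usage covered in that tutorial (assuming detectron2 and opencv-python are installed, and that &#34;input.jpg&#34; is a placeholder path to a local image):&lt;/p&gt;

```python
# Minimal inference sketch; requires detectron2 and opencv-python,
# and downloads pretrained weights on first run.
import cv2
from detectron2 import model_zoo
from detectron2.config import get_cfg
from detectron2.engine import DefaultPredictor

# Load a Mask R-CNN config and its pretrained COCO weights from the model zoo.
cfg = get_cfg()
cfg.merge_from_file(model_zoo.get_config_file(
    "COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_3x.yaml"))
cfg.MODEL.WEIGHTS = model_zoo.get_checkpoint_url(
    "COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_3x.yaml")
cfg.MODEL.ROI_HEADS.SCORE_THRESH_TEST = 0.5  # confidence threshold for detections

# Run inference on a single BGR image ("input.jpg" is a placeholder).
predictor = DefaultPredictor(cfg)
outputs = predictor(cv2.imread("input.jpg"))
print(outputs["instances"].pred_classes)      # predicted class ids
print(outputs["instances"].pred_masks.shape)  # per-instance segmentation masks
```

&lt;p&gt;The same pattern works for any model zoo config: swap in a different YAML file (e.g. a panoptic or keypoint model) and the matching checkpoint URL.&lt;/p&gt;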
&lt;p&gt;Learn more at our &lt;a class=&#34;link&#34; href=&#34;https://detectron2.readthedocs.org&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;documentation&lt;/a&gt;.
See &lt;a class=&#34;link&#34; href=&#34;projects/&#34; &gt;projects/&lt;/a&gt; for examples of research projects built on top of Detectron2.&lt;/p&gt;
&lt;h2 id=&#34;model-zoo-and-baselines&#34;&gt;Model Zoo and Baselines
&lt;/h2&gt;&lt;p&gt;We provide a large set of baseline results and trained models available for download in the &lt;a class=&#34;link&#34; href=&#34;MODEL_ZOO.md&#34; &gt;Detectron2 Model Zoo&lt;/a&gt;.&lt;/p&gt;
&lt;h2 id=&#34;license&#34;&gt;License
&lt;/h2&gt;&lt;p&gt;Detectron2 is released under the &lt;a class=&#34;link&#34; href=&#34;LICENSE&#34; &gt;Apache 2.0 license&lt;/a&gt;.&lt;/p&gt;
&lt;h2 id=&#34;citing-detectron2&#34;&gt;Citing Detectron2
&lt;/h2&gt;&lt;p&gt;If you use Detectron2 in your research or wish to refer to the baseline results published in the &lt;a class=&#34;link&#34; href=&#34;MODEL_ZOO.md&#34; &gt;Model Zoo&lt;/a&gt;, please use the following BibTeX entry.&lt;/p&gt;
&lt;div class=&#34;highlight&#34;&gt;&lt;div class=&#34;chroma&#34;&gt;
&lt;table class=&#34;lntable&#34;&gt;&lt;tr&gt;&lt;td class=&#34;lntd&#34;&gt;
&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code&gt;&lt;span class=&#34;lnt&#34;&gt;1
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;2
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;3
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;4
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;5
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;6
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;7
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/td&gt;
&lt;td class=&#34;lntd&#34;&gt;
&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code class=&#34;language-BibTeX&#34; data-lang=&#34;BibTeX&#34;&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;nc&#34;&gt;@misc&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;{&lt;/span&gt;&lt;span class=&#34;nl&#34;&gt;wu2019detectron2&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;  &lt;span class=&#34;na&#34;&gt;author&lt;/span&gt; &lt;span class=&#34;p&#34;&gt;=&lt;/span&gt;       &lt;span class=&#34;s&#34;&gt;{Yuxin Wu and Alexander Kirillov and Francisco Massa and
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;s&#34;&gt;                  Wan-Yen Lo and Ross Girshick}&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;  &lt;span class=&#34;na&#34;&gt;title&lt;/span&gt; &lt;span class=&#34;p&#34;&gt;=&lt;/span&gt;        &lt;span class=&#34;s&#34;&gt;{Detectron2}&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;  &lt;span class=&#34;na&#34;&gt;howpublished&lt;/span&gt; &lt;span class=&#34;p&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;s&#34;&gt;{\url{https://github.com/facebookresearch/detectron2}}&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;  &lt;span class=&#34;na&#34;&gt;year&lt;/span&gt; &lt;span class=&#34;p&#34;&gt;=&lt;/span&gt;         &lt;span class=&#34;s&#34;&gt;{2019}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;p&#34;&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/td&gt;&lt;/tr&gt;&lt;/table&gt;
&lt;/div&gt;
&lt;/div&gt;</description>
        </item>
        
    </channel>
</rss>
