ZIP | yolov5 + csl_label. (Oriented Object Detection)(Rotation Detection)(Rotated BBox) Rotated object detection based on yolov5 (yolov5_obb) | 6.18MB

Uploaded by 2401_89451588. Requires 8 credits (1 credit = 1 CNY).

Resource file list:

yolov5 + csl_label. (Oriented Object Detection)(Rotation Detection)(Rotated BBox) Rotated object detection based on yolov5 (yolov5_obb): approximately 148 files
  1. yolov5 + csl_label.(Oriented Object Detection)(Rotation Detection)(Rotated BBox)基于yolov5的旋转目标检测_yolov5_obb/项目内附说明/如果解压失败请用ara软件解压.txt 42B
  2. yolov5_obb-master/Arial.ttf 755.11KB
  3. yolov5_obb-master/CONTRIBUTING.md 4.87KB
  4. yolov5_obb-master/detect.py 12.77KB
  5. yolov5_obb-master/Dockerfile 2.11KB
  6. yolov5_obb-master/export.py 21.26KB
  7. yolov5_obb-master/hubconf.py 6.23KB
  8. yolov5_obb-master/LICENSE 34.3KB
  9. yolov5_obb-master/README.md 4.62KB
  10. yolov5_obb-master/requirements.txt 926B
  11. yolov5_obb-master/setup.cfg 923B
  12. yolov5_obb-master/test.txt 6.94KB
  13. yolov5_obb-master/train.py 32.65KB
  14. yolov5_obb-master/tutorial.ipynb 55.67KB
  15. yolov5_obb-master/val.py 20.12KB
  16. yolov5_obb-master/data/dotav15_poly.yaml 847B
  17. yolov5_obb-master/data/dotav1_poly.yaml 819B
  18. yolov5_obb-master/data/DroneVehicle_poly.yaml 550B
  19. yolov5_obb-master/data/yolov5obb_demo.yaml 791B
  20. yolov5_obb-master/data/yolov5obb_demo_split.yaml 866B
  21. yolov5_obb-master/data/hyps/obb/hyp.finetune_dota.yaml 480B
  22. yolov5_obb-master/data/hyps/obb/hyp.finetune_dota_CloseAug.yaml 520B
  23. yolov5_obb-master/data/hyps/obb/hyp.finetune_DroneVehicle.yaml 485B
  24. yolov5_obb-master/data/hyps/obb/hyp.paper.yaml 488B
  25. yolov5_obb-master/data/scripts/download_weights.sh 523B
  26. yolov5_obb-master/dataset/dataset_demo/imgnamefile.txt 6B
  27. yolov5_obb-master/dataset/dataset_demo/images/P0032.png 5.3MB
  28. yolov5_obb-master/dataset/dataset_demo/labelTxt/P0032.txt 3.49KB
  29. yolov5_obb-master/docs/ChangeLog.md 1.07KB
  30. yolov5_obb-master/docs/detection.png 296.46KB
  31. yolov5_obb-master/docs/GetStart.md 6.65KB
  32. yolov5_obb-master/docs/install.md 1.67KB
  33. yolov5_obb-master/docs/results.png 110.54KB
  34. yolov5_obb-master/docs/train_batch6.jpg 87.7KB
  35. yolov5_obb-master/docs/YOLOv5_README.md 14.4KB
  36. yolov5_obb-master/DOTA_devkit/DOTA.py 4.15KB
  37. yolov5_obb-master/DOTA_devkit/DOTA2COCO.py 5.57KB
  38. yolov5_obb-master/DOTA_devkit/DOTA2JSON.py 3.67KB
  39. yolov5_obb-master/DOTA_devkit/dota_evaluation_task1.py 13.28KB
  40. yolov5_obb-master/DOTA_devkit/dota_evaluation_task2.py 9.87KB
  41. yolov5_obb-master/DOTA_devkit/dota_poly2rbox.py 7.66KB
  42. yolov5_obb-master/DOTA_devkit/dota_utils.py 10.18KB
  43. yolov5_obb-master/DOTA_devkit/hrsc2016_evaluation.py 10.74KB
  44. yolov5_obb-master/DOTA_devkit/ImgSplit.py 9.99KB
  45. yolov5_obb-master/DOTA_devkit/ImgSplit_multi_process.py 11.77KB
  46. yolov5_obb-master/DOTA_devkit/mAOE_evaluation.py 7.91KB
  47. yolov5_obb-master/DOTA_devkit/polyiou.cpp 3.88KB
  48. yolov5_obb-master/DOTA_devkit/polyiou.h 202B
  49. yolov5_obb-master/DOTA_devkit/polyiou.i 258B
  50. yolov5_obb-master/DOTA_devkit/polyiou.py 7.58KB
  51. yolov5_obb-master/DOTA_devkit/polyiou_wrap.cxx 263.78KB
  52. yolov5_obb-master/DOTA_devkit/prepare_dota1_ms.py 3.49KB
  53. yolov5_obb-master/DOTA_devkit/prepare_hrsc2016.py 714B
  54. yolov5_obb-master/DOTA_devkit/ResultEnsembleNMS_multi_process.py 9.96KB
  55. yolov5_obb-master/DOTA_devkit/ResultMerge.py 5.68KB
  56. yolov5_obb-master/DOTA_devkit/ResultMerge_multi_process.py 9.81KB
  57. yolov5_obb-master/DOTA_devkit/results_ensemble.py 2.46KB
  58. yolov5_obb-master/DOTA_devkit/results_obb2hbb.py 2.27KB
  59. yolov5_obb-master/DOTA_devkit/setup.py 445B
  60. yolov5_obb-master/DOTA_devkit/SplitOnlyImage.py 2.32KB
  61. yolov5_obb-master/DOTA_devkit/SplitOnlyImage_multi_process.py 3.7KB
  62. yolov5_obb-master/DOTA_devkit/ucasaod_evaluation.py 10.65KB
  63. yolov5_obb-master/DOTA_devkit/__init__.py
  64. yolov5_obb-master/DOTA_devkit/poly_nms_gpu/Makefile 56B
  65. yolov5_obb-master/DOTA_devkit/poly_nms_gpu/nms_wrapper.py 560B
  66. yolov5_obb-master/DOTA_devkit/poly_nms_gpu/poly_nms.cpp 344.34KB
  67. yolov5_obb-master/DOTA_devkit/poly_nms_gpu/poly_nms.hpp 298B
  68. yolov5_obb-master/DOTA_devkit/poly_nms_gpu/poly_nms.pyx 875B
  69. yolov5_obb-master/DOTA_devkit/poly_nms_gpu/poly_nms_kernel.cu 10.72KB
  70. yolov5_obb-master/DOTA_devkit/poly_nms_gpu/poly_nms_test.py
  71. yolov5_obb-master/DOTA_devkit/poly_nms_gpu/poly_overlaps.cpp 327.72KB
  72. yolov5_obb-master/DOTA_devkit/poly_nms_gpu/poly_overlaps.hpp 106B
  73. yolov5_obb-master/DOTA_devkit/poly_nms_gpu/poly_overlaps.pyx 552B
  74. yolov5_obb-master/DOTA_devkit/poly_nms_gpu/poly_overlaps_kernel.cu 12.54KB
  75. yolov5_obb-master/DOTA_devkit/poly_nms_gpu/setup.py 5.89KB
  76. yolov5_obb-master/DOTA_devkit/poly_nms_gpu/__init__.py 82B
  77. yolov5_obb-master/models/common.py 29.82KB
  78. yolov5_obb-master/models/experimental.py 4.48KB
  79. yolov5_obb-master/models/tf.py 20.23KB
  80. yolov5_obb-master/models/yolo.py 15.74KB
  81. yolov5_obb-master/models/yolov5l.yaml 1.37KB
  82. yolov5_obb-master/models/yolov5m.yaml 1.37KB
  83. yolov5_obb-master/models/yolov5n.yaml 1.37KB
  84. yolov5_obb-master/models/yolov5s.yaml 1.37KB
  85. yolov5_obb-master/models/yolov5x.yaml 1.37KB
  86. yolov5_obb-master/models/__init__.py
  87. yolov5_obb-master/models/hub/anchors.yaml 3.26KB
  88. yolov5_obb-master/models/hub/yolov3-spp.yaml 1.53KB
  89. yolov5_obb-master/models/hub/yolov3-tiny.yaml 1.2KB
  90. yolov5_obb-master/models/hub/yolov3.yaml 1.52KB
  91. yolov5_obb-master/models/hub/yolov5-bifpn.yaml 1.39KB
  92. yolov5_obb-master/models/hub/yolov5-fpn.yaml 1.19KB
  93. yolov5_obb-master/models/hub/yolov5-p2.yaml 1.62KB
  94. yolov5_obb-master/models/hub/yolov5-p6.yaml 1.66KB
  95. yolov5_obb-master/models/hub/yolov5-p7.yaml 2.03KB
  96. yolov5_obb-master/models/hub/yolov5-panet.yaml 1.37KB
  97. yolov5_obb-master/models/hub/yolov5l6.yaml 1.78KB
  98. yolov5_obb-master/models/hub/yolov5m6.yaml 1.78KB
  99. yolov5_obb-master/models/hub/yolov5n6.yaml 1.78KB
  100. yolov5_obb-master/models/hub/yolov5s-ghost.yaml 1.45KB
  101. yolov5_obb-master/models/hub/yolov5s-transformer.yaml 1.41KB
  102. yolov5_obb-master/models/hub/yolov5s6.yaml 1.78KB
  103. yolov5_obb-master/models/hub/yolov5x6.yaml 1.78KB
  104. yolov5_obb-master/sh/ddp_train.sh 1.58KB
  105. yolov5_obb-master/tools/TestJson2VocClassTxt.py 2.3KB
  106. yolov5_obb-master/tools/Xml2Txt.py 2.53KB
  107. yolov5_obb-master/utils/activations.py 3.69KB
  108. yolov5_obb-master/utils/augmentations.py 11.98KB
  109. yolov5_obb-master/utils/autoanchor.py 8.68KB
  110. yolov5_obb-master/utils/autobatch.py 2.13KB
  111. yolov5_obb-master/utils/callbacks.py 2.34KB
  112. yolov5_obb-master/utils/datasets.py 48.49KB
  113. yolov5_obb-master/utils/downloads.py 6.13KB
  114. yolov5_obb-master/utils/general.py 39.81KB
  115. yolov5_obb-master/utils/loss.py 13.12KB
  116. yolov5_obb-master/utils/metrics.py 13.75KB
  117. yolov5_obb-master/utils/plots.py 24.2KB
  118. yolov5_obb-master/utils/rboxs_utils.py 6.98KB
  119. yolov5_obb-master/utils/torch_utils.py 13.14KB
  120. yolov5_obb-master/utils/__init__.py 1.11KB
  121. yolov5_obb-master/utils/aws/mime.sh 780B
  122. yolov5_obb-master/utils/aws/resume.py 1.17KB
  123. yolov5_obb-master/utils/aws/userdata.sh 1.22KB
  124. yolov5_obb-master/utils/aws/__init__.py
  125. yolov5_obb-master/utils/flask_rest_api/example_request.py 299B
  126. yolov5_obb-master/utils/flask_rest_api/README.md 1.67KB
  127. yolov5_obb-master/utils/flask_rest_api/restapi.py 1.05KB
  128. yolov5_obb-master/utils/google_app_engine/additional_requirements.txt 105B
  129. yolov5_obb-master/utils/google_app_engine/app.yaml 174B
  130. yolov5_obb-master/utils/google_app_engine/Dockerfile 821B
  131. yolov5_obb-master/utils/loggers/__init__.py 7.7KB
  132. yolov5_obb-master/utils/loggers/wandb/log_dataset.py 1.01KB
  133. yolov5_obb-master/utils/loggers/wandb/README.md 10.57KB
  134. yolov5_obb-master/utils/loggers/wandb/sweep.py 1.12KB
  135. yolov5_obb-master/utils/loggers/wandb/sweep.yaml 2.41KB
  136. yolov5_obb-master/utils/loggers/wandb/wandb_utils.py 26.46KB
  137. yolov5_obb-master/utils/loggers/wandb/__init__.py
  138. yolov5_obb-master/utils/loggers/wandb/__pycache__/wandb_utils.cpython-39.pyc 19.17KB
  139. yolov5_obb-master/utils/loggers/wandb/__pycache__/__init__.cpython-39.pyc 158B
  140. yolov5_obb-master/utils/nms_rotated/nms_rotated_wrapper.py 2.74KB
  141. yolov5_obb-master/utils/nms_rotated/setup.py 1.67KB
  142. yolov5_obb-master/utils/nms_rotated/__init__.py 86B
  143. yolov5_obb-master/utils/nms_rotated/src/box_iou_rotated_utils.h 10.34KB
  144. yolov5_obb-master/utils/nms_rotated/src/nms_rotated_cpu.cpp 2.31KB
  145. yolov5_obb-master/utils/nms_rotated/src/nms_rotated_cuda.cu 4.6KB
  146. yolov5_obb-master/utils/nms_rotated/src/nms_rotated_ext.cpp 1.61KB
  147. yolov5_obb-master/utils/nms_rotated/src/poly_nms_cpu.cpp 140B
  148. yolov5_obb-master/utils/nms_rotated/src/poly_nms_cuda.cu 8.46KB

Resource introduction:

📚 This guide explains how to use **Weights & Biases** (W&B) with YOLOv5 🚀. UPDATED 29 September 2021.

* [About Weights & Biases](#about-weights-&-biases)
* [First-Time Setup](#first-time-setup)
* [Viewing runs](#viewing-runs)
* [Disabling wandb](#disabling-wandb)
* [Advanced Usage: Dataset Versioning and Evaluation](#advanced-usage)
* [Reports: Share your work with the world!](#reports)

## About Weights & Biases

Think of [W&B](https://wandb.ai/site?utm_campaign=repo_yolo_wandbtutorial) like GitHub for machine learning models. With a few lines of code, save everything you need to debug, compare and reproduce your models: architecture, hyperparameters, git commits, model weights, GPU usage, and even datasets and predictions. Used by top researchers including teams at OpenAI, Lyft, Github, and MILA, W&B is part of the new standard of best practices for machine learning.

How W&B can help you optimize your machine learning workflows:

* [Debug](https://wandb.ai/wandb/getting-started/reports/Visualize-Debug-Machine-Learning-Models--VmlldzoyNzY5MDk#Free-2) model performance in real time
* [GPU usage](https://wandb.ai/wandb/getting-started/reports/Visualize-Debug-Machine-Learning-Models--VmlldzoyNzY5MDk#System-4) visualized automatically
* [Custom charts](https://wandb.ai/wandb/customizable-charts/reports/Powerful-Custom-Charts-To-Debug-Model-Peformance--VmlldzoyNzY4ODI) for powerful, extensible visualization
* [Share insights](https://wandb.ai/wandb/getting-started/reports/Visualize-Debug-Machine-Learning-Models--VmlldzoyNzY5MDk#Share-8) interactively with collaborators
* [Optimize hyperparameters](https://docs.wandb.com/sweeps) efficiently
* [Track](https://docs.wandb.com/artifacts) datasets, pipelines, and production models

## First-Time Setup

<details open>
<summary> Toggle Details </summary>

When you first train, W&B will prompt you to create a new account and will generate an **API key** for you. If you are an existing user you can retrieve your key from https://wandb.ai/authorize. This key is used to tell W&B where to log your data. You only need to supply your key once, and then it is remembered on the same device.

W&B will create a cloud **project** (default is 'YOLOv5') for your training runs, and each new training run will be provided a unique run **name** within that project as project/name. You can also manually set your project and run name as:

```shell
$ python train.py --project ... --name ...
```

YOLOv5 notebook example: <a href="https://colab.research.google.com/github/ultralytics/yolov5/blob/master/tutorial.ipynb"><img src="https://colab.research.google.com/assets/colab-badge.svg" alt="Open In Colab"></a> <a href="https://www.kaggle.com/ultralytics/yolov5"><img src="https://kaggle.com/static/images/open-in-kaggle.svg" alt="Open In Kaggle"></a>
<img width="960" alt="Screen Shot 2021-09-29 at 10 23 13 PM" src="https://user-images.githubusercontent.com/26833433/135392431-1ab7920a-c49d-450a-b0b0-0c86ec86100e.png">

</details>
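If it helps to see the setup end to end, here is a minimal shell sketch of a typical first-time run. The `pip install wandb` / `wandb login` steps and the project/run names are illustrative assumptions, not commands taken from this repository's scripts:

```shell
# Minimal first-time setup sketch (names below are placeholders)
pip install wandb                                 # install the W&B client if it is not already in your environment
wandb login                                       # paste the API key from https://wandb.ai/authorize when prompted
python train.py --project yolov5_obb --name exp1  # logs this run as project "yolov5_obb", run "exp1"
```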
## Viewing Runs

<details open>
<summary> Toggle Details </summary>

Run information streams from your environment to the W&B cloud console as you train. This allows you to monitor and even cancel runs in <b>real time</b>. All important information is logged:

* Training & Validation losses
* Metrics: Precision, Recall, mAP@0.5, mAP@0.5:0.95
* Learning Rate over time
* A bounding box debugging panel, showing the training progress over time
* GPU: Type, **GPU Utilization**, power, temperature, **CUDA memory usage**
* System: Disk I/O, CPU utilization, RAM memory usage
* Your trained model as W&B Artifact
* Environment: OS and Python types, Git repository and state, **training command**

<p align="center"><img width="900" alt="Weights & Biases dashboard" src="https://user-images.githubusercontent.com/26833433/135390767-c28b050f-8455-4004-adb0-3b730386e2b2.png"></p>

</details>

## Disabling wandb

* Training after running `wandb disabled` inside that directory creates no wandb run.
  ![Screenshot (84)](https://user-images.githubusercontent.com/15766192/143441777-c780bdd7-7cb4-4404-9559-b4316030a985.png)
* To enable wandb again, run `wandb online` (see the sketch below).
  ![Screenshot (85)](https://user-images.githubusercontent.com/15766192/143441866-7191b2cb-22f0-4e0f-ae64-2dc47dc13078.png)
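As a quick recap of the two commands above, a minimal sketch run from your training directory; the `--data` yaml is just an example file shipped with this repo, and the exact training flags should be adapted to your setup:

```shell
# Toggle W&B logging for runs started from this directory
wandb disabled                                   # subsequent training runs create no wandb run
python train.py --data data/yolov5obb_demo.yaml  # trains without W&B logging
wandb online                                     # turn W&B logging back on
```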
## Advanced Usage

You can leverage W&B artifacts and Tables integration to easily visualize and manage your datasets, models and training evaluations. Here are some quick examples to get you started.

<details open>

<h3> 1: Train and Log Evaluation simultaneously </h3>
This is an extension of the previous section: it will also start training after uploading the dataset. <b>This also logs the evaluation Table.</b> The evaluation table compares your predictions and ground truths across the validation set for each epoch. It uses references to the already uploaded datasets, so no images will be uploaded from your system more than once.
<details open>
<summary> <b>Usage</b> </summary>
<b>Code</b> <code> $ python train.py --upload_data val</code>

![Screenshot from 2021-11-21 17-40-06](https://user-images.githubusercontent.com/15766192/142761183-c1696d8c-3f38-45ab-991a-bb0dfd98ae7d.png)
</details>

<h3> 2: Visualize and Version Datasets </h3>
Log, visualize, dynamically query, and understand your data with <a href='https://docs.wandb.ai/guides/data-vis/tables'>W&B Tables</a>. You can use the following command to log your dataset as a W&B Table. This will generate a <code>{dataset}_wandb.yaml</code> file which can be used to train from the dataset artifact.
<details>
<summary> <b>Usage</b> </summary>
<b>Code</b> <code> $ python utils/logger/wandb/log_dataset.py --project ... --name ... --data .. </code>

![Screenshot (64)](https://user-images.githubusercontent.com/15766192/128486078-d8433890-98a3-4d12-8986-b6c0e3fc64b9.png)
</details>

<h3> 3: Train using dataset artifact </h3>
When you upload a dataset as described in the first section, you get a new config file with <code>_wandb</code> added to its name. This file contains the information needed to train a model directly from the dataset artifact. <b>This also logs evaluation.</b>
<details>
<summary> <b>Usage</b> </summary>
<b>Code</b> <code> $ python train.py --data {data}_wandb.yaml </code>

![Screenshot (72)](https://user-images.githubusercontent.com/15766192/128979739-4cf63aeb-a76f-483f-8861-1c0100b938a5.png)
</details>

<h3> 4: Save model checkpoints as artifacts </h3>
To enable saving and versioning checkpoints of your experiment, pass <code>--save_period n</code> with the base command, where <code>n</code> is the checkpoint interval in epochs. You can also log both the dataset and model checkpoints simultaneously. If <code>--save_period</code> is not passed, only the final model will be logged.
<details>
<summary> <b>Usage</b> </summary>
<b>Code</b> <code> $ python train.py --save_period 1 </code>

![Screenshot (68)](https://user-images.githubusercontent.com/15766192/128726138-ec6c1f60-639d-437d-b4ee-3acd9de47ef3.png)
</details>

</details>

<h3> 5: Resume runs from checkpoint artifacts </h3>
Any run can be resumed using artifacts if the <code>--resume</code> argument starts with the <code>wandb-artifact://</code> prefix followed by the run path, i.e. <code>wandb-artifact://username/project/runid</code>. This doesn't require the model checkpoint to be present on the local system.
<details>
<summary> <b>Usage</b> </summary>
<b>Code</b> <code> $ python train.py --resume wandb-artifact://{run_path} </code>

![Screenshot (70)](https://user-images.githubusercontent.com/15766192/128728988-4e84b355-6c87-41ae-a591-14aecf45343e.png)
</details>

<h3> 6: Resume runs from dataset artifact & checkpoint artifacts </h3>
<b>Local dataset or model checkpoints are not required. This can be used to resume runs directly on a different device.</b>
The syntax is the same as in the previous section, but you'll need to log both the dataset and model checkpoints as artifacts, i.e. set both <code>--upload_dataset</code> and <code>--save_period</code>.
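To tie sections 4–6 together, here is a hedged end-to-end sketch. The data yaml and the entity/project/run-id values are placeholders, and the flag spellings should be verified against <code>python train.py --help</code> in this repository before use:

```shell
# End-to-end sketch (placeholders throughout; verify flags with `python train.py --help`)
# 1) train while logging the dataset and per-epoch checkpoints as W&B artifacts
python train.py --data data/yolov5obb_demo.yaml --upload_dataset --save_period 1

# 2) later, resume the same run on any machine straight from the logged artifacts
python train.py --resume wandb-artifact://<entity>/<project>/<run_id>
```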