Sourcery Starbot ⭐ refactored huyhoang17/DB_text_minimal #26

Open

wants to merge 1 commit into master
Conversation

SourceryAI

Thanks for starring sourcery-ai/sourcery ✨ 🌟 ✨

Here's your pull request refactoring your most popular Python repo.

If you want Sourcery to refactor all your Python repos and incoming pull requests, install our bot.

Review changes via command line

To manually merge these changes, make sure you're on the master branch, then run:

git fetch https://github.com/sourcery-ai-bot/DB_text_minimal master
git merge --ff-only FETCH_HEAD
git reset HEAD^

Comment on lines -59 to +61
-        augment_seq = iaa.Sequential([
-            iaa.Fliplr(0.5),
-            iaa.Affine(rotate=(-10, 10)),
-            iaa.Resize((0.5, 3.0))
-        ])
-        return augment_seq
+        return iaa.Sequential(
+            [iaa.Fliplr(0.5), iaa.Affine(rotate=(-10, 10)), iaa.Resize((0.5, 3.0))]
+        )
Function BaseDatasetIter._get_default_augment refactored with the following changes:

- Inline variable that is immediately returned
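For readers unfamiliar with this rule, a minimal sketch of the pattern (plain lists stand in for the `iaa.Sequential` pipeline; both function names are hypothetical):

```python
# Hedged sketch of "inline variable that is immediately returned".
# The list stands in for the imgaug augmenter pipeline; names are hypothetical.
def get_augment_before():
    augment_seq = ["fliplr", "affine", "resize"]  # temporary used only for the return
    return augment_seq

def get_augment_after():
    # same value, no intermediate name
    return ["fliplr", "affine", "resize"]
```

Both versions build the identical value; dropping the temporary just removes a name with no other use.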

Comment on lines -184 to +181
-        gt_fn = "gt_img{}.txt".format(img_id)
+        gt_fn = f"gt_img{img_id}.txt"
Function TotalTextDatasetIter.load_metadata refactored with the following changes:

- Replace call to str.format with f-string
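A quick check that the f-string rewrite is behavior-preserving for this simple substitution (`img_id = 7` is an arbitrary example value):

```python
# Both formatting styles produce the same filename for a simple positional
# substitution like the one in this diff.
img_id = 7  # arbitrary example value
old_style = "gt_img{}.txt".format(img_id)
new_style = f"gt_img{img_id}.txt"
```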

@@ -195,7 +192,6 @@ def load_all_anns(self, gt_paths):
             lines = []
             reader = open(gt, 'r').readlines()
             for line in reader:
-                item = {}
Function TotalTextDatasetIter.load_all_anns refactored with the following changes:

- Move assignment closer to its usage

Comment on lines -223 to +218
-        gt_fn = "{}.txt".format(img_id)
+        gt_fn = f"{img_id}.txt"
Function CTW1500DatasetIter.load_metadata refactored with the following changes:

- Replace call to str.format with f-string

@@ -237,7 +232,6 @@ def load_all_anns(self, gt_fps):
             lines = []
             with open(gt_fp, 'r') as f:
                 for line in f:
-                    item = {}
Function CTW1500DatasetIter.load_all_anns refactored with the following changes:

- Move assignment closer to its usage

Comment on lines -374 to -380
-        methodMetrics = {
-            'precision': methodPrecision,
-            'recall': methodRecall,
-            'hmean': methodHmean
-        }
-
-        return methodMetrics
+        return {
+            'precision': methodPrecision,
+            'recall': methodRecall,
+            'hmean': methodHmean,
+        }
Function DetectionDetEvalEvaluator.combine_results refactored with the following changes:

- Inline variable that is immediately returned

Comment on lines -394 to +392
-    args = parser.parse_args()
-    return args
+    return parser.parse_args()
Function load_args refactored with the following changes:

- Inline variable that is immediately returned

Comment on lines -426 to +423
-    results = []
-    for gt, pred in zip(gts, preds):
-        results.append(evaluator.evaluate_image(gt, pred))
+    results = [evaluator.evaluate_image(gt, pred) for gt, pred in zip(gts, preds)]
Lines 426-428 refactored with the following changes:

- Convert for loop into list comprehension
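A minimal sketch showing the loop-to-comprehension rewrite is equivalent (`evaluate_image` here is a hypothetical stand-in for the evaluator method):

```python
# Hedged equivalence check for the append-loop vs. list-comprehension rewrite.
# evaluate_image is a hypothetical stand-in for the evaluator's method.
def evaluate_image(gt, pred):
    return {"gt": gt, "pred": pred}

gts = [[0, 1], [1, 0]]
preds = [[0, 1], [0, 0]]

# before: accumulate with append
results_loop = []
for gt, pred in zip(gts, preds):
    results_loop.append(evaluate_image(gt, pred))

# after: one comprehension, as in the refactoring
results_comp = [evaluate_image(gt, pred) for gt, pred in zip(gts, preds)]
```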

-            iou = get_intersection(pD, pG) / get_union(pD, pG)
-            return iou
+            return get_intersection(pD, pG) / get_union(pD, pG)
Function DetectionIoUEvaluator.evaluate_image.get_intersection_over_union refactored with the following changes:

- Inline variable that is immediately returned


        def get_intersection(pD, pG):
            pD = Polygon(pD).buffer(0)
            pG = Polygon(pG).buffer(0)
            return pD.intersection(pG).area

        def compute_ap(confList, matchList, numGtCare):
            correct = 0

Function DetectionIoUEvaluator.evaluate_image.compute_ap refactored with the following changes:

Comment on lines -208 to -214
-        methodMetrics = {
-            'precision': methodPrecision,
-            'recall': methodRecall,
-            'hmean': methodHmean
-        }
-
-        return methodMetrics
+        return {
+            'precision': methodPrecision,
+            'recall': methodRecall,
+            'hmean': methodHmean,
+        }
Function DetectionIoUEvaluator.combine_results refactored with the following changes:

- Inline variable that is immediately returned

Comment on lines -228 to +225
-    args = parser.parse_args()
-    return args
+    return parser.parse_args()
Function load_args refactored with the following changes:

- Inline variable that is immediately returned

Comment on lines -281 to +277
-    results = []
-    # for each images
-    for gt, pred in zip(gts, preds):
-        results.append(evaluator.evaluate_image(gt, pred))
+    results = [evaluator.evaluate_image(gt, pred) for gt, pred in zip(gts, preds)]
Lines 281-284 refactored with the following changes:

- Convert for loop into list comprehension

This removes the following comments ( why? ):

# for each images

Comment on lines -38 to +40
-        balance_loss = (positive_loss.sum() + negative_loss.sum()) / (
-            no_positive + no_negative + self.eps)
-        return balance_loss
+        return (positive_loss.sum() + negative_loss.sum()) / (
+            no_positive + no_negative + self.eps
+        )
Function OHEMBalanceCrossEntropyLoss.forward refactored with the following changes:

- Inline variable that is immediately returned

Comment on lines -77 to +80
-            loss = (torch.abs(pred - gt) * mask).sum() / \
-                (mask.sum() + self.eps)
-        else:
-            l1_loss_fn = torch.nn.L1Loss(reduction=self.reduction)
-            loss = l1_loss_fn(pred, gt)
-        return loss
+            return (torch.abs(pred - gt) * mask).sum() / (mask.sum() + self.eps)
+
+        l1_loss_fn = torch.nn.L1Loss(reduction=self.reduction)
+        return l1_loss_fn(pred, gt)
Function L1Loss.forward refactored with the following changes:

- Inline variable that is immediately returned
- Remove unnecessary else after guard condition
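A hedged sketch of the early-return shape applied above, with plain numbers instead of tensors (`masked_mean` and its arguments are hypothetical):

```python
# Hedged sketch: once the if-branch returns, the else becomes unnecessary
# and its body can be dedented. Plain-number stand-in for the tensor math.
def masked_mean(values, mask, eps=1e-6):
    if sum(mask) > 0:
        # returning here makes the code below an implicit else branch
        return sum(v * m for v, m in zip(values, mask)) / (sum(mask) + eps)
    return sum(values) / len(values)
```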

-    print(">>> Devide: {}".format(opt.device))
+    print(f">>> Devide: {opt.device}")
Function main refactored with the following changes:

- Replace call to str.format with f-string

-    opt = parser.parse_args()
-    return opt
+    return parser.parse_args()
Function load_args refactored with the following changes:

- Inline variable that is immediately returned

-    args = parser.parse_args()
-    return args
+    return parser.parse_args()
Function load_args refactored with the following changes:

- Inline variable that is immediately returned

Comment on lines -194 to +193
-    result = {"pred": pred, "score": float(confidence_score)}
-    return result
+    return {"pred": pred, "score": float(confidence_score)}
Function predict refactored with the following changes:

- Inline variable that is immediately returned

Comment on lines -246 to +244
-    print(">>> Detect: {}'s".format(end))
+    print(f">>> Detect: {end}'s")
Function main refactored with the following changes:

- Replace call to str.format with f-string

This removes the following comments ( why? ):

# noqa

Comment on lines -19 to +22
-        hist = np.bincount(n_class * label_true[mask].astype(int) +
-                           label_pred[mask],
-                           minlength=n_class**2).reshape(n_class, n_class)
-
-        return hist
+        return np.bincount(
+            n_class * label_true[mask].astype(int) + label_pred[mask],
+            minlength=n_class**2,
+        ).reshape(n_class, n_class)
Function RunningScore._fast_hist refactored with the following changes:

- Inline variable that is immediately returned

@@ -29,7 +28,6 @@ def update(self, label_trues, label_preds):
                     lt.flatten(), lp.flatten(), self.n_classes)
             except Exception as e:
                 print(e)
-                pass
Function RunningScore.update refactored with the following changes:

- Remove redundant pass statement

@@ -116,7 +114,6 @@ def measure(self, batch, output, is_output_polygon=False, box_thresh=0.6):
             filename: the original filenames of images.
             output: (polygons, ...)
         '''
-        results = []
Function QuadMetric.measure refactored with the following changes:

- Convert for loop into list comprehension

This removes the following comments ( why? ):

# for 1 image

-    args = parser.parse_args()
-    return args
+    return parser.parse_args()
Function load_args refactored with the following changes:

- Inline variable that is immediately returned

Comment on lines -25 to +31
-    url = "http://{}:{}/{}/{}".format(args.host, args.port, args.mode,
-                                      args.model_name)
+    url = f"http://{args.host}:{args.port}/{args.mode}/{args.model_name}"
     image_path = args.image_path
     with open(image_path, "rb") as f:
         data = f.read()

     start = time.time()
     resp = requests.post(url, data=data).text
-    print("REST took: {}'s".format(time.time() - start))
+    print(f"REST took: {time.time() - start}'s")
Function main refactored with the following changes:

- Replace calls to str.format with f-strings

Comment on lines -222 to +225
-        layers = []
-        layers.append(block(self.inplanes, planes, stride, downsample,
-                            dcn=dcn))
+        layers = [block(self.inplanes, planes, stride, downsample, dcn=dcn)]
         self.inplanes = planes * block.expansion
-        for i in range(1, blocks):
-            layers.append(block(self.inplanes, planes, dcn=dcn))
+        layers.extend(block(self.inplanes, planes, dcn=dcn) for _ in range(1, blocks))
Function ResNet._make_layer refactored with the following changes:

- Merge append into list declaration
- Replace for-append loop with list extend
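A hedged sketch of the list-building rewrite above; `make_block` is a hypothetical stand-in for the ResNet block constructor:

```python
# Hedged sketch: list literal + extend-with-generator produces the same
# list as starting empty and appending in a loop.
def make_block(i):  # hypothetical stand-in for the block constructor
    return f"block-{i}"

blocks = 4

# before: seed with the first element, append the rest in a loop
layers_loop = [make_block(0)]
for i in range(1, blocks):
    layers_loop.append(make_block(i))

# after: same seed, extend with a generator expression
layers_new = [make_block(0)]
layers_new.extend(make_block(i) for i in range(1, blocks))
```

`extend` with a generator avoids building an intermediate list, which is why Sourcery prefers it over repeated `append` calls.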

-        for i in range(fpem_repeat):
+        for _ in range(fpem_repeat):
Function FPEM_FFM.__init__ refactored with the following changes:

- Replace unused loop variable with underscore

-        Fy = torch.cat([c2_ffm, c3, c4, c5], dim=1)
-        return Fy
+        return torch.cat([c2_ffm, c3, c4, c5], dim=1)
Function FPEM_FFM.forward refactored with the following changes:

- Inline variable that is immediately returned

Comment on lines -42 to +44
-            y = torch.cat((shrink_maps, threshold_maps, binary_maps), dim=1)
-        else:
-            y = torch.cat((shrink_maps, threshold_maps), dim=1)
-        return y
+            return torch.cat((shrink_maps, threshold_maps, binary_maps), dim=1)
+        return torch.cat((shrink_maps, threshold_maps), dim=1)
Function DBHead.forward refactored with the following changes:

- Inline variable that is immediately returned
- Remove unnecessary else after guard condition

Comment on lines -86 to +102
-        if smooth:
-            inter_out_channels = out_channels
-            if out_channels == 1:
-                inter_out_channels = in_channels
-            module_list = [
-                nn.Upsample(scale_factor=2, mode='nearest'),
-                nn.Conv2d(in_channels, inter_out_channels, 3, 1, 1, bias=bias)
-            ]
-            if out_channels == 1:
-                module_list.append(
-                    nn.Conv2d(in_channels,
-                              out_channels,
-                              kernel_size=1,
-                              stride=1,
-                              padding=1,
-                              bias=True))
-            return nn.Sequential(module_list)
-        else:
-            return nn.ConvTranspose2d(in_channels, out_channels, 2, 2)
+        if not smooth:
+            return nn.ConvTranspose2d(in_channels, out_channels, 2, 2)
+        inter_out_channels = out_channels
+        if out_channels == 1:
+            inter_out_channels = in_channels
+        module_list = [
+            nn.Upsample(scale_factor=2, mode='nearest'),
+            nn.Conv2d(in_channels, inter_out_channels, 3, 1, 1, bias=bias)
+        ]
+        if out_channels == 1:
+            module_list.append(
+                nn.Conv2d(in_channels,
+                          out_channels,
+                          kernel_size=1,
+                          stride=1,
+                          padding=1,
+                          bias=True))
+        return nn.Sequential(module_list)
Function DBHead._init_upsample refactored with the following changes:

- Swap if/else branches so the guard condition returns early
- Remove unnecessary else after guard condition
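A hedged sketch of the guard-clause rewrite: both shapes return the same values, but the refactored one keeps the long path at the top indent level (`build_label` and its arguments are hypothetical):

```python
# Hedged sketch of the guard-clause rewrite applied above: handle the
# simple case first and return, so the main path loses one indent level.
def build_label_nested(smooth, name):  # pre-refactor shape
    if smooth:
        parts = ["upsample", name]
        return "-".join(parts)
    else:
        return f"transpose:{name}"

def build_label_guarded(smooth, name):  # post-refactor shape
    if not smooth:
        return f"transpose:{name}"
    parts = ["upsample", name]
    return "-".join(parts)
```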
